
Glossario IA

The complete dictionary of Artificial Intelligence

162 categories · 2,032 subcategories · 23,060 terms

Co-Attention

Bidirectional attention mechanism in which two modalities or sequences attend to each other to capture cross-correlations. Fundamental in multimodal tasks such as VQA (Visual Question Answering).
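A minimal NumPy sketch of the idea: each modality uses the shared affinity matrix to attend over the other. Function and variable names are illustrative, not from any specific library.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def co_attention(img, txt):
    """Bidirectional attention between two modalities.
    img: (n, d) image-region features; txt: (m, d) token features."""
    affinity = img @ txt.T                         # (n, m) cross-modal scores
    txt_ctx = softmax(affinity, axis=1) @ txt      # image attends to text -> (n, d)
    img_ctx = softmax(affinity.T, axis=1) @ img    # text attends to image -> (m, d)
    return txt_ctx, img_ctx
```

In a real model the affinity would typically be a learned bilinear form rather than a raw dot product; the symmetric softmax over both axes is the core of the mechanism.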


Attention Pyramid Network

Pyramidal architecture integrating attention mechanisms at each hierarchical level for progressive information aggregation. Enables efficient fusion of multi-scale features with adaptive attention weights.


Cascaded Attention

Sequential chaining of attention layers in which the output of one layer feeds into the next with progressive refinement. Enables fine-grained modeling of complex dependencies through multiple attention steps.


Hierarchical Feature Learning

Process of extracting features at multiple levels of abstraction, from pixels/tokens up to high-level concepts. Naturally integrated with hierarchical attention for structured data representation.


Multi-Level Attention Fusion

Technique combining outputs of attention mechanisms at different hierarchical levels through adaptive weighting. Optimizes integration of multi-scale contextual information into a unified representation.
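The adaptive weighting can be sketched as a softmax over per-level scores. This is a minimal illustration, assuming a hypothetical learned gating vector; real models learn the scoring function end to end.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def multi_level_fusion(features, gate):
    """Fuse attention outputs from different hierarchy levels.
    features: list of L arrays, each (n, d), one per level.
    gate: (d,) hypothetical learned gating vector scoring each level."""
    stacked = np.stack(features)                # (L, n, d)
    level_scores = stacked.mean(axis=1) @ gate  # (L,) one score per level
    w = softmax(level_scores)                   # adaptive level weights
    return np.tensordot(w, stacked, axes=1)     # (n, d) unified representation
```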


Hierarchical Self-Attention

Extension of self-attention applied recursively on hierarchical groupings of tokens or segments. Enables efficient modeling of long-range dependencies in structured documents.


Global-Local Attention

Hybrid architecture combining attention over the entire sequence with focused attention on local segments. Effectively balances global context perception with fine details.
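A minimal sketch of the hybrid: full self-attention for global context, a banded mask for the local view, averaged together. The masking scheme and the equal-weight combination are illustrative simplifications.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def global_local_attention(x, window=2):
    """Combine full-sequence attention with windowed local attention.
    x: (n, d) token features; window: local radius (illustrative)."""
    n, d = x.shape
    scores = x @ x.T / np.sqrt(d)                      # (n, n) similarities
    idx = np.arange(n)
    local_mask = np.abs(idx[:, None] - idx[None, :]) <= window
    local = softmax(np.where(local_mask, scores, -1e9), axis=-1) @ x
    global_ = softmax(scores, axis=-1) @ x
    return 0.5 * (global_ + local)                     # average of both views
```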


Hierarchical Cross-Attention

Mechanism where a hierarchy of queries attends to a hierarchy of keys/values for multi-level interaction. Essential in translation and generation tasks with hierarchical structures.


Pyramid Attention Module

Specific module integrating an attention pyramid with progressive reduction rates for computational efficiency. Optimizes the performance/cost ratio in vision-transformer models.


Hierarchical Attention Network

Architecture that stacks attention mechanisms over a document's hierarchy, typically attending over words to build sentence vectors and over sentences to build a document vector. A pioneering approach in sentiment analysis and document classification.
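The two-level pooling can be sketched as follows. This is a simplified illustration: the context vectors stand in for learned parameters, and the encoder networks that precede each attention step are omitted.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attend(h, ctx):
    """Attention pooling: weight rows of h (n, d) by similarity to ctx (d,)."""
    w = softmax(h @ ctx)        # (n,) attention weights
    return w @ h                # (d,) pooled vector

def han_document_vector(sentences, word_ctx, sent_ctx):
    """Two-level pooling: words -> sentence vectors -> document vector.
    sentences: list of (n_words, d) word-embedding matrices, one per sentence.
    word_ctx, sent_ctx: (d,) hypothetical learned context vectors."""
    sent_vecs = np.stack([attend(s, word_ctx) for s in sentences])
    return attend(sent_vecs, sent_ctx)   # (d,) document representation
```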


Multi-Granularity Attention

Approach applying attention simultaneously on different data granularities (words, sentences, paragraphs). Allows nuanced text understanding at multiple semantic levels.


Hierarchical Attention Routing

Mechanism dynamically directing information through a hierarchy based on attention scores. Optimizes information flow in deep neural networks with tree-like structures.
