AI Glossary
The complete dictionary of Artificial Intelligence
Co-Attention
Bidirectional mechanism in which two modalities or sequences attend to each other to capture cross-modal correlations. Fundamental in multi-modal tasks such as VQA (Visual Question Answering).
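As an illustration of the bidirectionality described above, here is a minimal numpy sketch of co-attention between image-region features and question-token features. It uses a shared affinity matrix with no learned projections and random data; real models add learned Q/K/V weights.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def co_attention(A, B):
    """Bidirectional co-attention between two feature sequences.

    A: (n, d), e.g. image-region features; B: (m, d), e.g. question tokens.
    Returns A summarized from B's viewpoint and B summarized from A's.
    """
    # One affinity matrix (n, m) drives both attention directions.
    affinity = A @ B.T / np.sqrt(A.shape[-1])
    A_to_B = softmax(affinity, axis=1)    # each row of A attends over B
    B_to_A = softmax(affinity.T, axis=1)  # each row of B attends over A
    attended_B = A_to_B @ B   # (n, d): question context per image region
    attended_A = B_to_A @ A   # (m, d): visual context per question token
    return attended_A, attended_B

rng = np.random.default_rng(0)
img = rng.standard_normal((5, 8))   # 5 image regions, dim 8
txt = rng.standard_normal((7, 8))   # 7 question tokens, dim 8
a_ctx, b_ctx = co_attention(img, txt)
print(a_ctx.shape, b_ctx.shape)  # (7, 8) (5, 8)
```

Because both directions share one affinity matrix, each modality's attention over the other is derived from the same pairwise scores, which is the "mutual" part of co-attention.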
Attention Pyramid Network
Pyramidal architecture integrating attention mechanisms at each hierarchical level for progressive information aggregation. Enables efficient fusion of multi-scale features with adaptive attention weights.
Cascaded Attention
Sequential chaining of attention layers in which the output of one layer feeds the next, refining the representation at each step. Enables fine-grained modeling of complex dependencies through multiple attention passes.
Hierarchical Feature Learning
Process of feature extraction at multiple abstraction levels, from pixel/token to high-level concepts. Naturally integrated with hierarchical attention for structured data representation.
Multi-Level Attention Fusion
Technique combining outputs of attention mechanisms at different hierarchical levels through adaptive weighting. Optimizes integration of multi-scale contextual information into a unified representation.
Hierarchical Self-Attention
Extension of self-attention applied recursively on hierarchical groupings of tokens or segments. Enables efficient modeling of long-range dependencies in structured documents.
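A minimal numpy sketch of the two-level recursion described above: self-attention runs inside each fixed-length segment, then over mean-pooled segment summaries, and the global context is broadcast back to the tokens. The equal segment length, mean pooling, and additive merge are simplifying assumptions for illustration.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(X):
    # Plain scaled dot-product self-attention; learned projections
    # are omitted to keep the sketch minimal.
    scores = X @ X.T / np.sqrt(X.shape[-1])
    return softmax(scores, axis=-1) @ X

def hierarchical_self_attention(X, segment_len):
    """Two-level self-attention: within segments, then across segments."""
    n, d = X.shape
    segs = X.reshape(n // segment_len, segment_len, d)
    # Level 1: local self-attention inside each segment.
    local = np.stack([self_attention(s) for s in segs])
    # Level 2: self-attention over segment summaries (mean-pooled).
    summaries = local.mean(axis=1)           # (n_segments, d)
    global_ctx = self_attention(summaries)   # (n_segments, d)
    # Broadcast each segment's global context back to its tokens.
    return (local + global_ctx[:, None, :]).reshape(n, d)

X = np.random.default_rng(1).standard_normal((12, 4))
out = hierarchical_self_attention(X, segment_len=4)
print(out.shape)  # (12, 4)
```

The efficiency gain comes from never forming the full n-by-n attention matrix: attention is quadratic only within each segment and across the much shorter list of summaries.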
Global-Local Attention
Hybrid architecture combining attention over the entire sequence with focused attention on local segments. Balances awareness of global context with fine-grained local detail.
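A small numpy sketch of the hybrid above: each token computes a local context (attention masked to a sliding window) and a global context (unmasked attention), then mixes them. The fixed 0.5/0.5 mix is an illustrative assumption; real models typically learn this gate.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def global_local_attention(X, window):
    """Mix windowed local attention with full-sequence global attention."""
    n, d = X.shape
    scores = X @ X.T / np.sqrt(d)
    # Local branch: mask out positions beyond the sliding window.
    idx = np.arange(n)
    local_mask = np.abs(idx[:, None] - idx[None, :]) <= window
    local_scores = np.where(local_mask, scores, -np.inf)
    local_ctx = softmax(local_scores, axis=-1) @ X
    # Global branch: unmasked attention over the full sequence.
    global_ctx = softmax(scores, axis=-1) @ X
    # Fixed 0.5/0.5 gate (assumption; usually learned).
    return 0.5 * local_ctx + 0.5 * global_ctx

X = np.random.default_rng(2).standard_normal((10, 6))
Y = global_local_attention(X, window=2)
print(Y.shape)  # (10, 6)
```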
Hierarchical Cross-Attention
Mechanism where a hierarchy of queries attends to a hierarchy of keys/values for multi-level interaction. Essential in translation and generation tasks with hierarchical structures.
Pyramid Attention Module
Specific module integrating an attention pyramid with progressive reduction rates for computational efficiency. Optimizes the performance/cost ratio in vision-transformer models.
Hierarchical Attention Network
Architecture built by stacking hierarchical attention mechanisms (typically word-level then sentence-level) for structured data processing. A pioneering approach to document classification and sentiment analysis.
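The word-then-sentence stacking can be sketched in numpy as two rounds of attention pooling against a context vector. The `tanh` scoring, the context vectors `u_word`/`u_sent`, and the random inputs are illustrative stand-ins for what the real network learns end to end.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention_pool(H, u):
    """Score each row of H against context vector u, return weighted sum."""
    weights = softmax(np.tanh(H) @ u)   # (k,) attention weights
    return weights @ H                  # (d,) pooled vector

def han_document_vector(doc, u_word, u_sent):
    """doc: list of sentences, each an array of word vectors (k_i, d)."""
    # Level 1: pool each sentence's words into one sentence vector.
    sents = np.stack([attention_pool(s, u_word) for s in doc])
    # Level 2: pool sentence vectors into one document vector.
    return attention_pool(sents, u_sent)

rng = np.random.default_rng(3)
d = 6
doc = [rng.standard_normal((4, d)),   # sentence with 4 words
       rng.standard_normal((7, d)),   # sentence with 7 words
       rng.standard_normal((3, d))]   # sentence with 3 words
v = han_document_vector(doc, rng.standard_normal(d), rng.standard_normal(d))
print(v.shape)  # (6,)
```

Note that sentences of different lengths pose no problem: each is pooled to a fixed-size vector before the sentence-level attention runs.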
Multi-Granularity Attention
Approach applying attention simultaneously on different data granularities (words, sentences, paragraphs). Allows nuanced text understanding at multiple semantic levels.
Hierarchical Attention Routing
Mechanism dynamically directing information through a hierarchy based on attention scores. Optimizes information flow in deep neural networks with tree-like structures.