
AI Glossary

The complete dictionary of Artificial Intelligence

162 categories · 2,032 subcategories · 23,060 terms

Attention Mechanism

Mathematical foundation allowing models to weight the relative importance of elements in a data sequence.

5 terms

Self-Attention

Mechanism where each element of a sequence computes its attention relative to all other elements in the same sequence.

0 terms
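A minimal NumPy sketch of the idea (the projection matrices, sequence length, and random seed are illustrative, not taken from this glossary): every row of `X` produces a query that is compared against every row's key, and the resulting weights mix the value vectors.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(X, Wq, Wk, Wv):
    """Scaled dot-product self-attention: each element of the sequence
    attends to all elements of the same sequence, itself included."""
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    d_k = K.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)      # (seq_len, seq_len) affinities
    weights = softmax(scores, axis=-1)   # each row sums to 1
    return weights @ V, weights

rng = np.random.default_rng(0)
X = rng.normal(size=(4, 8))              # 4 tokens, model dimension 8
Wq, Wk, Wv = (rng.normal(size=(8, 8)) for _ in range(3))
out, w = self_attention(X, Wq, Wk, Wv)   # out: (4, 8), w: (4, 4)
```

The key property is that the attention-weight matrix `w` is row-stochastic: each output token is a convex combination of the value vectors.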

Multi-Head Attention

Attention extension using multiple attention heads in parallel to capture different types of relationships.

3 terms
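A toy sketch of the head-splitting mechanics (learned projections are replaced by identity slicing here, purely for illustration): the model dimension is divided across parallel heads, each head runs attention on its own slice, and the results are concatenated back.

```python
import numpy as np

def multi_head_attention(X, num_heads):
    """Toy multi-head self-attention: the feature dimension is split
    across `num_heads` heads that attend in parallel."""
    seq_len, d_model = X.shape
    assert d_model % num_heads == 0
    d_head = d_model // num_heads
    # (heads, seq, d_head): each head sees its own slice of the features
    H = X.reshape(seq_len, num_heads, d_head).transpose(1, 0, 2)
    scores = H @ H.transpose(0, 2, 1) / np.sqrt(d_head)
    e = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights = e / e.sum(axis=-1, keepdims=True)  # softmax per head
    out = weights @ H                            # (heads, seq, d_head)
    # Concatenate the heads back into the model dimension
    return out.transpose(1, 0, 2).reshape(seq_len, d_model)

rng = np.random.default_rng(0)
Y = multi_head_attention(rng.normal(size=(4, 8)), num_heads=2)
```

Because each head operates on a different subspace, the heads can specialize in different relationships, as the definition above describes.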

Positional Encoding

Technique for incorporating the sequential position of elements into embeddings without using an RNN.

12 terms
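The classic instance of this technique is the sinusoidal encoding from the original Transformer paper, sketched below: each position gets a deterministic vector of sines and cosines at different frequencies, added to the token embeddings.

```python
import numpy as np

def sinusoidal_positional_encoding(seq_len, d_model):
    """Sinusoidal positional encoding:
    PE[pos, 2i]   = sin(pos / 10000^(2i / d_model))
    PE[pos, 2i+1] = cos(pos / 10000^(2i / d_model))"""
    pos = np.arange(seq_len)[:, None]            # (seq_len, 1)
    i = np.arange(d_model // 2)[None, :]         # (1, d_model/2)
    angles = pos / np.power(10000.0, 2 * i / d_model)
    pe = np.zeros((seq_len, d_model))
    pe[:, 0::2] = np.sin(angles)                 # even dims: sine
    pe[:, 1::2] = np.cos(angles)                 # odd dims: cosine
    return pe

pe = sinusoidal_positional_encoding(10, 16)      # (10, 16)
```

No RNN is needed: position information is injected directly into the embedding space before any attention layer runs.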

Encoder-Decoder Architecture

Fundamental structure of Transformers separating input processing (encoder) and output generation (decoder).

2 terms

Attention Scaling

Normalization of attention scores by the square root of the key dimension (√d_k) to stabilize training by keeping the softmax out of its saturated, near-zero-gradient regime.

14 terms
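A small demonstration of why the √d_k division matters (dimensions and seed chosen arbitrarily for illustration): dot products of random d_k-dimensional vectors have variance proportional to d_k, so unscaled logits saturate the softmax onto a single key, while scaled logits keep the distribution soft.

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

rng = np.random.default_rng(1)
d_k = 512
q = rng.normal(size=d_k)
keys = rng.normal(size=(8, d_k))

raw = keys @ q                 # logit variance grows with d_k
scaled = raw / np.sqrt(d_k)    # variance brought back to ~1

# Unscaled: nearly all probability mass collapses onto one key.
# Scaled: the distribution stays soft, so gradients remain usable.
peak_raw, peak_scaled = softmax(raw).max(), softmax(scaled).max()
```

`peak_raw` sits close to 1 while `peak_scaled` is markedly smaller, which is exactly the saturation the scaling prevents.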

Cross-Attention

Attention mechanism between two different sequences, used in translation and multimodal tasks.

8 terms
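A sketch of the asymmetry that defines cross-attention (sequence lengths and dimensions are arbitrary examples): queries come from one sequence, such as a decoder state, while keys and values come from another, such as the encoder output, so the result has the length of the query sequence.

```python
import numpy as np

def cross_attention(Q_seq, KV_seq):
    """Cross-attention between two different sequences: Q_seq queries
    KV_seq, which supplies both keys and values."""
    d_k = KV_seq.shape[-1]
    scores = Q_seq @ KV_seq.T / np.sqrt(d_k)     # (len_q, len_kv)
    e = np.exp(scores - scores.max(axis=-1, keepdims=True))
    w = e / e.sum(axis=-1, keepdims=True)
    return w @ KV_seq                            # (len_q, d_k)

rng = np.random.default_rng(0)
decoder_states = rng.normal(size=(3, 8))         # e.g. 3 target tokens
encoder_output = rng.normal(size=(5, 8))         # e.g. 5 source tokens
ctx = cross_attention(decoder_states, encoder_output)
```

In translation, this is how each target token gathers context from the full source sentence.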

Sparse Attention

Variant of attention computed only on a subset of positions to reduce computational complexity.

3 terms
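One common sparse pattern is local (sliding-window) attention, sketched below as an illustrative example: each position attends only to neighbors within a fixed window, cutting the cost from O(n²) to O(n·w).

```python
import numpy as np

def local_attention_mask(seq_len, window):
    """Sparse attention pattern: position i may attend only to
    positions j with |i - j| <= window."""
    idx = np.arange(seq_len)
    return np.abs(idx[:, None] - idx[None, :]) <= window

mask = local_attention_mask(6, 1)
# Each row allows at most 3 positions: itself plus one neighbor per side.
```

Applying this boolean mask to the score matrix (setting disallowed entries to -inf before the softmax) yields the reduced-complexity attention the definition describes.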

Attention Masks

Control mechanisms that mask certain positions during attention computation to prevent information leakage.

9 terms
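The canonical example is the causal mask used in decoders, sketched here: future positions receive a score of -inf, so after the softmax they get exactly zero weight and no information leaks backward in time.

```python
import numpy as np

def causal_mask(seq_len):
    """Lower-triangular additive mask: position i may attend only to
    positions j <= i; future positions are set to -inf."""
    allowed = np.tril(np.ones((seq_len, seq_len), dtype=bool))
    return np.where(allowed, 0.0, -np.inf)

def masked_softmax(scores):
    e = np.exp(scores - scores.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

scores = np.zeros((4, 4)) + causal_mask(4)   # uniform scores + mask
weights = masked_softmax(scores)
# Row i spreads its weight uniformly over positions 0..i; the rest is 0.
```

With uniform scores, row 0 puts all weight on position 0, while row 3 spreads 0.25 across all four positions.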

Vision Transformers

Adaptation of the Transformer architecture to computer vision tasks by treating images as sequences of patches.

9 terms
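The "images as sequences of patches" step can be sketched as pure array reshaping (image size and patch size below are arbitrary examples): the image is cut into non-overlapping patches, each flattened into a vector, producing the token sequence a Vision Transformer consumes.

```python
import numpy as np

def image_to_patches(img, patch):
    """Split an (H, W, C) image into a sequence of flattened
    non-overlapping patches, the ViT input format."""
    H, W, C = img.shape
    assert H % patch == 0 and W % patch == 0
    g = img.reshape(H // patch, patch, W // patch, patch, C)
    g = g.transpose(0, 2, 1, 3, 4)               # (nH, nW, patch, patch, C)
    return g.reshape(-1, patch * patch * C)      # (num_patches, patch_dim)

img = np.arange(32 * 32 * 3, dtype=float).reshape(32, 32, 3)
patches = image_to_patches(img, 8)
# 32/8 = 4 patches per side -> 16 patches, each 8*8*3 = 192 values
```

Each patch vector is then linearly projected and given a positional encoding before entering the standard Transformer stack.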

Efficient Attention

Set of optimizations aimed at reducing the quadratic complexity of standard attention for longer sequences.

2 terms

Hierarchical Attention

Multi-level attention structure capturing relationships at different hierarchical scales in the data.

12 terms