
AI Glossary

The complete Artificial Intelligence dictionary

162 categories · 2,032 subcategories · 23,060 terms

Teacher-Student Architecture

Framework in which a large teacher model trains a smaller student model by transferring its implicit knowledge through soft targets and regularization.
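The soft-target idea above can be sketched in a few lines of pure Python: the teacher's logits are softened with a temperature and the student is penalized by the KL divergence to that distribution (Hinton-style distillation; function names and logit values here are illustrative).

```python
import math

def softmax(logits, T=1.0):
    """Temperature-softened softmax: higher T flattens the distribution."""
    exps = [math.exp(z / T) for z in logits]
    s = sum(exps)
    return [e / s for e in exps]

def kd_loss(teacher_logits, student_logits, T=4.0):
    """KL(teacher || student) on temperature-softened outputs,
    scaled by T^2 to keep gradient magnitudes comparable."""
    p = softmax(teacher_logits, T)   # soft targets from the teacher
    q = softmax(student_logits, T)   # student's softened predictions
    kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))
    return T * T * kl

teacher = [3.0, 1.0, 0.2]   # hypothetical teacher logits
student = [2.5, 1.2, 0.3]   # hypothetical student logits
loss = kd_loss(teacher, student)
```

The loss is zero when the student matches the teacher exactly and grows with the mismatch; in practice it is combined with the usual cross-entropy on hard labels.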


Feature Map Distillation

Knowledge transfer method at the level of intermediate model representations rather than at the final prediction level.


Attention Transfer

Transfer of attention maps from the teacher to the student to preserve the important regions identified by the complex model.
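A minimal sketch of how such an attention map is built and compared, assuming the common formulation that sums squared activations over channels and matches L2-normalized maps (the feature-map shapes here are toy values):

```python
import math

def attention_map(feature_map):
    """Collapse a CxHxW activation into an HxW attention map by
    summing squared activations over the channel dimension."""
    C = len(feature_map)
    H, W = len(feature_map[0]), len(feature_map[0][0])
    return [[sum(feature_map[c][i][j] ** 2 for c in range(C))
             for j in range(W)] for i in range(H)]

def at_loss(teacher_fm, student_fm):
    """L2 distance between the two L2-normalized, flattened attention maps."""
    def flat_norm(fm):
        v = [x for row in attention_map(fm) for x in row]
        n = math.sqrt(sum(x * x for x in v)) or 1.0
        return [x / n for x in v]
    t = flat_norm(teacher_fm)
    s = flat_norm(student_fm)
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(t, s)))
```

Because the maps are normalized, the loss depends only on *where* each network concentrates its activations, not on their absolute scale.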


Relation Knowledge Distillation

Approach that preserves the structural relationships between training samples (e.g. their pairwise distances) rather than distilling each sample's output individually.
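The distance-based variant of this idea can be sketched as follows: both models embed a batch of samples, and the student is penalized for distorting the teacher's (mean-normalized) pairwise distance structure. This is a pure-Python illustration with made-up embeddings, not a specific library's API:

```python
import math

def pairwise_distances(embeddings):
    """Matrix of Euclidean distances between every pair of embeddings."""
    n = len(embeddings)
    return [[math.dist(embeddings[i], embeddings[j]) for j in range(n)]
            for i in range(n)]

def rkd_distance_loss(teacher_emb, student_emb):
    """Squared difference between the teacher's and student's
    mean-normalized pairwise distance matrices."""
    def normalized(d):
        n = len(d)
        off = [d[i][j] for i in range(n) for j in range(n) if i != j]
        mu = (sum(off) / len(off)) or 1.0   # mean off-diagonal distance
        return [[x / mu for x in row] for row in d]
    t = normalized(pairwise_distances(teacher_emb))
    s = normalized(pairwise_distances(student_emb))
    return sum((ti - si) ** 2 for rt, rs in zip(t, s) for ti, si in zip(rt, rs))
```

Note that a student whose embedding space is a uniformly scaled copy of the teacher's incurs zero loss: only the *relations* between samples matter.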


Self-Distillation

Process in which a model improves itself by distilling its own knowledge into another instance of the same architecture, or from its deeper layers into its shallower ones.


Progressive Distillation

Iterative distillation method in which each trained student becomes, in turn, the teacher of an even more compact model.


Online Knowledge Distillation

Approach where multiple models train each other in real-time without requiring a pre-trained teacher.


Cross-Domain Distillation

Technique for transferring knowledge between models operating on different domains but sharing similar underlying structures.


Lifelong Learning via Distillation

Application of distillation to preserve acquired knowledge during continuous learning and avoid catastrophic forgetting.


Ensemble Distillation

Compression of an ensemble of models into a single compact model preserving the diversity of collective knowledge.
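In the simplest form, the "collective knowledge" is just the average of the members' softened predictions, which the single student is then trained to match. A minimal sketch (temperature and logits are illustrative):

```python
import math

def softmax(logits, T=1.0):
    """Temperature-softened softmax."""
    exps = [math.exp(z / T) for z in logits]
    s = sum(exps)
    return [e / s for e in exps]

def ensemble_soft_targets(member_logits, T=2.0):
    """Average the temperature-softened predictions of all ensemble
    members; the compact student distills from this distribution."""
    probs = [softmax(logits, T) for logits in member_logits]
    k = len(probs)
    return [sum(p[i] for p in probs) / k for i in range(len(probs[0]))]

# Two hypothetical members that disagree: the averaged target
# keeps the uncertainty their disagreement encodes.
targets = ensemble_soft_targets([[1.0, 0.0], [0.0, 1.0]])
```

Averaging preserves information a single hard label would discard: when members disagree, the target stays close to uniform instead of committing to one class.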


Neural Architecture Search with Distillation

Integration of distillation into the NAS process to guide the search for efficient architectures preserving performance.


Contrastive Knowledge Distillation

Approach using positive and negative contrasts to transfer discriminative representations from teacher to student.
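An InfoNCE-style sketch of the idea: the student's embedding of a sample should be more similar to the teacher's embedding of the *same* sample (the positive) than to the teacher's embeddings of other samples (the negatives). Pure Python, with an illustrative temperature `tau`:

```python
import math

def cosine(u, v):
    """Cosine similarity between two vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)

def contrastive_kd_loss(student_emb, teacher_embs, pos_index, tau=0.1):
    """InfoNCE-style loss: pull the student toward the teacher's
    embedding of the same sample, push it away from the others."""
    sims = [math.exp(cosine(student_emb, t) / tau) for t in teacher_embs]
    return -math.log(sims[pos_index] / sum(sims))
```

The loss is small when the student already agrees with the teacher on the positive pair and large when it is closer to a negative, which is what makes the transferred representations discriminative.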
