
AI Glossary

The complete glossary of AI terms

162 categories · 2,032 subcategories · 23,060 terms

Teacher-Student Architecture

Framework in which a teacher model trains a student model by transferring its implicit knowledge through soft targets and regularization.
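As a minimal sketch of this idea, the classic soft-target loss (popularised by Hinton et al.) combines a temperature-scaled softmax with a KL divergence between teacher and student distributions. All function and variable names below are illustrative, not from any particular library:

```python
import numpy as np

def softmax(logits, T=1.0):
    """Temperature-scaled softmax: higher T spreads probability mass."""
    z = logits / T
    z = z - z.max()                      # numerical stability
    e = np.exp(z)
    return e / e.sum()

def distillation_loss(student_logits, teacher_logits, T=4.0):
    """KL divergence between softened teacher and student distributions,
    scaled by T^2 to keep gradient magnitudes comparable across temperatures."""
    p = softmax(teacher_logits, T)       # soft targets from the teacher
    q = softmax(student_logits, T)
    return T * T * np.sum(p * (np.log(p) - np.log(q)))

teacher = np.array([6.0, 2.0, -1.0])    # confident teacher logits
aligned = np.array([5.5, 2.1, -0.8])    # student close to the teacher
wrong   = np.array([-1.0, 6.0, 2.0])    # student that disagrees

assert distillation_loss(aligned, teacher) < distillation_loss(wrong, teacher)
```

In practice this term is usually mixed with an ordinary cross-entropy loss on the hard labels, weighted by a hyperparameter.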


Feature Map Distillation

Knowledge-transfer method that matches intermediate representations of the two models rather than their final predictions.
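A toy FitNets-style sketch: when the student's feature map has fewer channels than the teacher's, a small learned regressor (here a 1×1-conv-style linear map, initialised randomly for illustration) projects it into the teacher's space before an MSE "hint" loss is applied. Names and shapes are assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical intermediate activations: (channels, height, width)
teacher_feat = rng.standard_normal((8, 4, 4))
student_feat = rng.standard_normal((4, 4, 4))   # student is narrower

# 1x1-conv-style regressor lifting student channels to the teacher's; learned in practice
W = rng.standard_normal((8, 4)) * 0.1

def hint_loss(s_feat, t_feat, W):
    """MSE between the teacher's feature map and the student's feature map
    projected into the teacher's channel space (FitNets-style hint loss)."""
    projected = np.einsum('oc,chw->ohw', W, s_feat)
    return np.mean((projected - t_feat) ** 2)

loss = hint_loss(student_feat, teacher_feat, W)
assert loss > 0
```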


Attention Transfer

Transfer of attention maps from teacher to student, so that the student preserves the salient regions identified by the larger model.
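A common activation-based formulation (in the style of Zagoruyko & Komodakis) sums squared activations over channels to get a spatial attention map, normalises it, and penalises the distance between the teacher's and student's maps. The sketch below assumes matching spatial dimensions; channel counts may differ:

```python
import numpy as np

def attention_map(feat):
    """Spatial attention map: channel-wise sum of squared activations,
    flattened and L2-normalised."""
    amap = (feat ** 2).sum(axis=0).ravel()
    return amap / np.linalg.norm(amap)

def at_loss(student_feat, teacher_feat):
    """Squared distance between normalised attention maps."""
    return np.sum((attention_map(student_feat) - attention_map(teacher_feat)) ** 2)

rng = np.random.default_rng(1)
t = rng.standard_normal((8, 4, 4))
s_same_pattern = 2.0 * t                 # scaled copy: identical spatial attention
s_random = rng.standard_normal((8, 4, 4))

assert at_loss(s_same_pattern, t) < 1e-9
assert at_loss(s_random, t) > at_loss(s_same_pattern, t)
```

Because the maps are normalised, the loss is invariant to the overall activation scale and only compares *where* each network looks.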


Relation Knowledge Distillation

Approach that preserves structural relationships between training samples (distances, angles) rather than matching individual outputs.
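The distance-based variant of relational KD can be sketched by matching normalised pairwise-distance matrices of the two embedding spaces. Normalising by the mean distance makes the loss invariant to rotation and uniform scaling, as the test case illustrates (all names here are illustrative):

```python
import numpy as np

def pairwise_dist(emb):
    """Pairwise Euclidean distances within a batch, normalised by their mean
    so only *relative* structure matters."""
    diff = emb[:, None, :] - emb[None, :, :]
    d = np.sqrt((diff ** 2).sum(-1))
    return d / d[d > 0].mean()

def rkd_distance_loss(student_emb, teacher_emb):
    """Match the structure (relative distances) of the two embedding spaces."""
    return np.mean((pairwise_dist(student_emb) - pairwise_dist(teacher_emb)) ** 2)

rng = np.random.default_rng(2)
t_emb = rng.standard_normal((5, 16))
# A rotated and scaled copy preserves every pairwise relation exactly
q, _ = np.linalg.qr(rng.standard_normal((16, 16)))
s_emb = 3.0 * t_emb @ q

assert rkd_distance_loss(s_emb, t_emb) < 1e-9
```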


Self-Distillation

Process in which a model improves itself by distilling its knowledge into a fresh instance of its own architecture, or from its deeper layers into its shallower ones.


Progressive Distillation

Iterative distillation method where the student gradually becomes the teacher for even more compact models.


Online Knowledge Distillation

Approach in which multiple peer models train one another simultaneously, without requiring a pre-trained teacher.
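In a deep-mutual-learning-style setup, each peer optimises its own cross-entropy plus a KL term pulling it toward the other peer's current prediction. The two-peer sketch below uses illustrative names and toy logits:

```python
import numpy as np

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

def kl(p, q):
    return np.sum(p * (np.log(p) - np.log(q)))

def mutual_losses(logits_a, logits_b, labels_onehot):
    """Each peer combines its own cross-entropy with a KL term toward the
    other's prediction; both are updated in the same training step."""
    pa, pb = softmax(logits_a), softmax(logits_b)
    ce_a = -np.sum(labels_onehot * np.log(pa))
    ce_b = -np.sum(labels_onehot * np.log(pb))
    return ce_a + kl(pb, pa), ce_b + kl(pa, pb)

y = np.array([1.0, 0.0, 0.0])
la, lb = mutual_losses(np.array([2.0, 0.5, -1.0]),
                       np.array([1.5, 1.0, -0.5]), y)
assert la > 0 and lb > 0
```

When the peers agree exactly, the KL terms vanish and each loss reduces to its own cross-entropy.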


Cross-Domain Distillation

Technique for transferring knowledge between models operating on different domains but sharing similar underlying structures.


Lifelong Learning via Distillation

Application of distillation to preserve previously acquired knowledge during continual learning, mitigating catastrophic forgetting.
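A Learning-without-Forgetting-style objective illustrates the idea: the new-task loss is augmented with a distillation penalty keeping the model's old-task outputs close to the values recorded before the update. The sketch below is an assumption-laden toy, not a specific library's API:

```python
import numpy as np

def softmax(z, T=2.0):
    z = z / T
    e = np.exp(z - z.max())
    return e / e.sum()

def lwf_loss(new_task_ce, old_logits_before, old_logits_after, lam=1.0, T=2.0):
    """New-task loss plus a KL penalty anchoring old-task outputs to their
    pre-update (recorded) values."""
    p_old = softmax(old_logits_before, T)
    p_new = softmax(old_logits_after, T)
    kl = np.sum(p_old * (np.log(p_old) - np.log(p_new)))
    return new_task_ce + lam * (T * T) * kl

before  = np.array([4.0, 1.0, -2.0])    # old-task logits recorded pre-update
stable  = np.array([3.9, 1.1, -1.9])    # little forgetting
drifted = np.array([0.0, 3.0, 1.0])     # catastrophic drift

assert lwf_loss(0.5, before, stable) < lwf_loss(0.5, before, drifted)
```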


Ensemble Distillation

Compression of an ensemble of models into a single compact model that preserves the diversity of the ensemble's collective knowledge.
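In the simplest formulation, the student's soft target is the average of the members' softened predictions, so the blended distribution carries the ensemble's consensus and its uncertainty. A sketch with illustrative names:

```python
import numpy as np

def softmax(z, T=1.0):
    z = z / T
    e = np.exp(z - z.max())
    return e / e.sum()

def ensemble_soft_targets(member_logits, T=2.0):
    """Average the softened predictions of all ensemble members; the single
    student is then trained against this blended distribution."""
    return np.mean([softmax(l, T) for l in member_logits], axis=0)

members = [np.array([3.0, 1.0, 0.0]),
           np.array([2.5, 1.5, -0.5]),
           np.array([3.2, 0.5, 0.2])]
target = ensemble_soft_targets(members)

assert abs(target.sum() - 1.0) < 1e-9    # still a valid distribution
assert target.argmax() == 0              # ensemble consensus on class 0
```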


Neural Architecture Search with Distillation

Integration of distillation into the neural architecture search (NAS) process to guide the search toward efficient architectures that preserve performance.


Contrastive Knowledge Distillation

Approach using positive and negative contrasts to transfer discriminative representations from teacher to student.
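A simplified InfoNCE-style term (in the spirit of contrastive representation distillation) makes this concrete: for each sample, the teacher's embedding of the *same* sample is the positive, and the teacher's embeddings of other samples are negatives. Names and the temperature value are assumptions:

```python
import numpy as np

def info_nce(student_emb, teacher_emb, i, tau=0.1):
    """Contrastive term for sample i: pull the student's embedding toward the
    teacher's embedding of the same sample, push it away from the teacher's
    embeddings of the other samples in the batch."""
    s = student_emb[i] / np.linalg.norm(student_emb[i])
    t = teacher_emb / np.linalg.norm(teacher_emb, axis=1, keepdims=True)
    sims = t @ s / tau                   # cosine similarities / temperature
    return -sims[i] + np.log(np.sum(np.exp(sims)))

rng = np.random.default_rng(3)
t_emb = rng.standard_normal((6, 8))
aligned  = t_emb.copy()                  # student matches the teacher exactly
shuffled = np.roll(t_emb, 1, axis=0)     # student matches the WRONG samples

assert info_nce(aligned, t_emb, 0) < info_nce(shuffled, t_emb, 0)
```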
