AI Glossary

A complete glossary of artificial intelligence

162 categories · 2,032 subcategories · 23,060 terms

Teacher-Student Architecture

Framework in which a teacher model trains a student model by transferring its implicit knowledge through soft targets and regularization.
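
A minimal PyTorch sketch of the classic soft-target setup; the architectures, temperature T, and weighting alpha below are illustrative assumptions, not a prescribed recipe:

```python
# Classic teacher-student distillation; model sizes, T, and alpha are assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.5):
    # Soft targets: KL divergence between temperature-softened distributions.
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=-1),
        F.softmax(teacher_logits / T, dim=-1),
        reduction="batchmean",
    ) * (T * T)
    # Hard targets: standard cross-entropy against the ground-truth labels.
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1 - alpha) * hard

teacher = nn.Sequential(nn.Linear(32, 128), nn.ReLU(), nn.Linear(128, 10)).eval()
student = nn.Sequential(nn.Linear(32, 16), nn.ReLU(), nn.Linear(16, 10))
x, y = torch.randn(8, 32), torch.randint(0, 10, (8,))
with torch.no_grad():
    teacher_logits = teacher(x)        # teacher provides soft targets only
distillation_loss(student(x), teacher_logits, y).backward()
```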


Feature Map Distillation

Knowledge transfer method at the level of intermediate model representations rather than at the final prediction level.
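
A hedged sketch in the FitNets spirit: the student's intermediate features are regressed onto the teacher's through a learned projection. Layer widths are illustrative assumptions:

```python
# Feature-map distillation in the FitNets spirit; layer widths are assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F

teacher_backbone = nn.Sequential(nn.Linear(32, 256), nn.ReLU()).eval()
student_backbone = nn.Sequential(nn.Linear(32, 64), nn.ReLU())
regressor = nn.Linear(64, 256)         # projects student features into teacher space

x = torch.randn(8, 32)
with torch.no_grad():
    t_feat = teacher_backbone(x)       # intermediate teacher representation
s_feat = student_backbone(x)           # intermediate student representation
hint_loss = F.mse_loss(regressor(s_feat), t_feat)
hint_loss.backward()
```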


Attention Transfer

Transfer of attention maps from the teacher to the student to preserve the important regions identified by the complex model.
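
A minimal sketch of the activation-based variant: spatial attention maps are derived by summing squared activations over channels, then matched with an L2-type loss. Feature shapes and the normalization choice are assumptions:

```python
# Activation-based attention transfer; feature shapes and normalization are assumptions.
import torch
import torch.nn.functional as F

def attention_map(feat):
    # Collapse channels: sum of squared activations -> (B, H*W), then L2-normalize.
    return F.normalize(feat.pow(2).sum(dim=1).flatten(1), dim=1)

t_feat = torch.randn(8, 256, 14, 14)                      # teacher feature map (fixed)
s_feat = torch.randn(8, 64, 14, 14, requires_grad=True)   # student feature map
at_loss = (attention_map(s_feat) - attention_map(t_feat)).pow(2).mean()
at_loss.backward()
```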


Relation Knowledge Distillation

Approach that preserves the structural relationships between training samples rather than matching individual predictions.
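
A hedged sketch of the distance-based variant: the student matches the teacher's normalized pairwise-distance matrix over a batch. Embedding sizes and the eps-guarded distance computation are illustrative assumptions:

```python
# RKD-style distance loss; embedding sizes and the eps guard are assumptions.
import torch
import torch.nn.functional as F

def normalized_pairwise_distances(emb):
    # Squared Euclidean distances via the Gram matrix; eps keeps sqrt's gradient finite.
    sq = (emb.pow(2).sum(1, keepdim=True) + emb.pow(2).sum(1)
          - 2.0 * emb @ emb.t()).clamp_min(0.0)
    d = (sq + 1e-12).sqrt()
    return d / d.mean()                # scale-invariant: divide by the mean distance

t_emb = torch.randn(16, 128)                       # teacher embeddings (fixed)
s_emb = torch.randn(16, 32, requires_grad=True)    # student embeddings (dims may differ)
rkd_loss = F.smooth_l1_loss(normalized_pairwise_distances(s_emb),
                            normalized_pairwise_distances(t_emb))
rkd_loss.backward()
```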


Self-Distillation

Process in which a model improves by distilling its own knowledge into a new instance of itself (in some variants, a deeper or wider version).
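
A minimal sketch of the "born-again" flavor, where a fresh copy of the same architecture learns from the previous generation's soft targets; the architecture and temperature are assumptions:

```python
# "Born-again" self-distillation; architecture and temperature are assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F

def make_model():
    return nn.Sequential(nn.Linear(32, 64), nn.ReLU(), nn.Linear(64, 10))

generation_0 = make_model().eval()     # assume this was already trained normally
generation_1 = make_model()            # identical architecture, fresh weights

x = torch.randn(8, 32)
with torch.no_grad():
    targets = F.softmax(generation_0(x) / 2.0, dim=-1)   # previous self as teacher
loss = F.kl_div(F.log_softmax(generation_1(x) / 2.0, dim=-1), targets,
                reduction="batchmean")
loss.backward()
```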


Progressive Distillation

Iterative distillation method where the student gradually becomes the teacher for even more compact models.
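
A hedged sketch of the chain: each distilled student immediately serves as the teacher for a narrower model. The widths and the single illustrative gradient step stand in for full training loops:

```python
# Progressive distillation chain; widths and the single gradient step are assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F

def make_model(width):
    return nn.Sequential(nn.Linear(32, width), nn.ReLU(), nn.Linear(width, 10))

def distill_step(teacher, student, x, T=4.0):
    with torch.no_grad():
        soft = F.softmax(teacher(x) / T, dim=-1)
    loss = F.kl_div(F.log_softmax(student(x) / T, dim=-1), soft,
                    reduction="batchmean")
    loss.backward()                    # one illustrative step; real training iterates

teacher = make_model(256)              # assume pre-trained
x = torch.randn(8, 32)
for width in (128, 64, 32):            # progressively more compact students
    student = make_model(width)
    distill_step(teacher, student, x)
    teacher = student.eval()           # the distilled student becomes the next teacher
```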


Online Knowledge Distillation

Approach in which multiple models train one another simultaneously, each serving as the others' teacher, without requiring a pre-trained teacher.
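
A minimal sketch in the deep-mutual-learning spirit: two randomly initialized peers exchange soft targets while both train on labels. The architectures and the unweighted loss sum are assumptions:

```python
# Deep-mutual-learning-style online distillation; architectures and weights are assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F

peer_a = nn.Sequential(nn.Linear(32, 64), nn.ReLU(), nn.Linear(64, 10))
peer_b = nn.Sequential(nn.Linear(32, 48), nn.ReLU(), nn.Linear(48, 10))

def mutual_loss(own_logits, peer_logits, labels):
    # Cross-entropy on labels plus KL toward the (detached) peer's distribution.
    kl = F.kl_div(F.log_softmax(own_logits, dim=-1),
                  F.softmax(peer_logits.detach(), dim=-1), reduction="batchmean")
    return F.cross_entropy(own_logits, labels) + kl

x, y = torch.randn(8, 32), torch.randint(0, 10, (8,))
logits_a, logits_b = peer_a(x), peer_b(x)
loss = mutual_loss(logits_a, logits_b, y) + mutual_loss(logits_b, logits_a, y)
loss.backward()                        # both peers learn from labels and from each other
```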


Cross-Domain Distillation

Technique for transferring knowledge between models operating on different domains but sharing similar underlying structures.
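
A hedged sketch of the idea using paired inputs from two domains; the RGB-versus-grayscale pairing below is purely an illustrative assumption. The source-domain teacher's soft labels supervise the target-domain student:

```python
# Cross-domain distillation on paired inputs; the RGB/grayscale pairing is an assumption.
import torch
import torch.nn as nn
import torch.nn.functional as F

teacher = nn.Sequential(nn.Flatten(), nn.Linear(3 * 8 * 8, 10)).eval()  # source domain
student = nn.Sequential(nn.Flatten(), nn.Linear(1 * 8 * 8, 10))         # target domain

rgb = torch.randn(4, 3, 8, 8)
gray = rgb.mean(dim=1, keepdim=True)   # the paired sample in the other domain
with torch.no_grad():
    soft = F.softmax(teacher(rgb), dim=-1)
loss = F.kl_div(F.log_softmax(student(gray), dim=-1), soft, reduction="batchmean")
loss.backward()
```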


Lifelong Learning via Distillation

Application of distillation to preserve acquired knowledge during continuous learning and avoid catastrophic forgetting.
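
A minimal sketch in the Learning-without-Forgetting style: while fitting a new task, the model is also pulled toward a frozen snapshot of its earlier predictions. Task shapes are assumptions:

```python
# Learning-without-Forgetting-style distillation; task shapes are assumptions.
import copy
import torch
import torch.nn as nn
import torch.nn.functional as F

model = nn.Sequential(nn.Linear(32, 64), nn.ReLU(), nn.Linear(64, 10))
old_snapshot = copy.deepcopy(model).eval()   # frozen copy taken before the new task

x_new, y_new = torch.randn(8, 32), torch.randint(0, 10, (8,))
with torch.no_grad():
    old_out = F.softmax(old_snapshot(x_new), dim=-1)

new_task = F.cross_entropy(model(x_new), y_new)      # learn the new task
retain = F.kl_div(F.log_softmax(model(x_new), dim=-1), old_out,
                  reduction="batchmean")             # stay close to old behavior
(new_task + retain).backward()
```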


Ensemble Distillation

Compression of an ensemble of models into a single compact model preserving the diversity of collective knowledge.
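
A minimal sketch: the averaged probability distributions of several teachers become the single soft target for one compact student. The ensemble size and widths are assumptions:

```python
# Ensemble distillation; ensemble size and widths are assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F

ensemble = [nn.Sequential(nn.Linear(32, 64), nn.ReLU(), nn.Linear(64, 10)).eval()
            for _ in range(3)]
student = nn.Sequential(nn.Linear(32, 16), nn.ReLU(), nn.Linear(16, 10))

x = torch.randn(8, 32)
with torch.no_grad():
    # Average the ensemble's probability distributions into a single soft target.
    soft = torch.stack([F.softmax(m(x), dim=-1) for m in ensemble]).mean(dim=0)
loss = F.kl_div(F.log_softmax(student(x), dim=-1), soft, reduction="batchmean")
loss.backward()
```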


Neural Architecture Search with Distillation

Integration of distillation into the NAS process to guide the search toward efficient architectures that preserve performance.
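
A deliberately toy sketch of the idea: candidate student widths are each briefly trained against a fixed teacher's soft targets, and the final distillation loss ranks them. A real NAS would search a far richer space with proper training and validation; everything here is an assumption:

```python
# Toy distillation-guided search over student widths; everything here is an assumption.
import torch
import torch.nn as nn
import torch.nn.functional as F

teacher = nn.Sequential(nn.Linear(32, 128), nn.ReLU(), nn.Linear(128, 10)).eval()
x = torch.randn(64, 32)
with torch.no_grad():
    soft = F.softmax(teacher(x), dim=-1)

def distill_score(width, steps=50):
    candidate = nn.Sequential(nn.Linear(32, width), nn.ReLU(), nn.Linear(width, 10))
    opt = torch.optim.Adam(candidate.parameters(), lr=1e-2)
    for _ in range(steps):             # briefly train the candidate to mimic the teacher
        loss = F.kl_div(F.log_softmax(candidate(x), dim=-1), soft,
                        reduction="batchmean")
        opt.zero_grad()
        loss.backward()
        opt.step()
    return loss.item()                 # final distillation loss ranks the candidate

best_width = min((8, 16, 32, 64), key=distill_score)
print("selected width:", best_width)
```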


Contrastive Knowledge Distillation

Approach that uses positive and negative pairs to transfer discriminative representations from teacher to student.
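
A minimal InfoNCE-style sketch (CRD-flavored): the student embedding of each sample is pulled toward the teacher embedding of the same sample and pushed away from the other samples in the batch. The projection size and temperature are assumptions:

```python
# InfoNCE-style contrastive distillation; projection size and temperature are assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F

t_emb = F.normalize(torch.randn(16, 64), dim=1)       # teacher embeddings (fixed)
s_raw = torch.randn(16, 32, requires_grad=True)       # student features
project = nn.Linear(32, 64)                           # align student and teacher dims
s_emb = F.normalize(project(s_raw), dim=1)

logits = s_emb @ t_emb.t() / 0.1       # similarity of every student/teacher pair
labels = torch.arange(16)              # index i: same-sample pair is the positive
nce_loss = F.cross_entropy(logits, labels)
nce_loss.backward()
```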
