Knowledge Distillation
Online Knowledge Distillation
A training scheme in which multiple student models train simultaneously and distill knowledge from each other's predictions during training, removing the need for a separate pre-trained teacher.
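A minimal sketch of this idea in the style of deep mutual learning: two linear softmax classifiers train on the same toy data, and each one's loss combines the ground-truth labels with the peer's soft predictions. The data, model shapes, and the `alpha` weighting are hypothetical choices for illustration, not a prescribed recipe.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy 2-class data: two Gaussian blobs.
X = np.vstack([rng.normal(-1, 1, (100, 2)), rng.normal(1, 1, (100, 2))])
y = np.array([0] * 100 + [1] * 100)

def softmax(z):
    z = z - z.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

Y = np.eye(2)[y]  # one-hot labels

# Two peer models trained simultaneously; neither is a fixed teacher.
W1 = rng.normal(0, 0.1, (2, 2))
W2 = rng.normal(0, 0.1, (2, 2))
lr, alpha = 0.1, 0.5  # alpha weights the mutual-distillation term

for step in range(200):
    p1, p2 = softmax(X @ W1), softmax(X @ W2)
    # For softmax logits, the gradient of cross-entropy against a target
    # distribution is (prediction - target); applied to both the hard
    # labels and the peer's current soft predictions.
    g1 = X.T @ ((p1 - Y) + alpha * (p1 - p2)) / len(X)
    g2 = X.T @ ((p2 - Y) + alpha * (p2 - p1)) / len(X)
    W1 -= lr * g1
    W2 -= lr * g2

acc1 = (softmax(X @ W1).argmax(1) == y).mean()
acc2 = (softmax(X @ W2).argmax(1) == y).mean()
print(acc1, acc2)
```

Because the peers update each other's targets every step, the distillation signal evolves "online" alongside training, which is the distinguishing feature of this family of methods.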