Knowledge Distillation
Online Knowledge Distillation
A training scheme in which multiple peer models are trained simultaneously and learn from one another's predictions, removing the need for a fixed, pre-trained teacher.
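One well-known instance of online distillation is deep mutual learning, where each peer minimizes its own cross-entropy loss plus a KL-divergence term that pulls its softened predictions toward the other peer's. The sketch below (plain NumPy; the function names, temperature `T`, and mixing weight `alpha` are illustrative choices, not a fixed API) shows the two per-peer losses for a pair of networks:

```python
import numpy as np

def softmax(z, T=1.0):
    # Temperature-scaled softmax over the last axis.
    z = z / T
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def kl_div(p, q, eps=1e-12):
    # KL(p || q), averaged over the batch.
    return np.mean(np.sum(p * (np.log(p + eps) - np.log(q + eps)), axis=-1))

def mutual_distillation_losses(logits_a, logits_b, labels, T=2.0, alpha=0.5):
    # Each peer's loss = (1 - alpha) * CE(labels) + alpha * KL(peer || self).
    # Both networks are optimized at the same time; neither is a frozen teacher.
    pa, pb = softmax(logits_a, T), softmax(logits_b, T)
    n = len(labels)
    ce_a = -np.mean(np.log(softmax(logits_a)[np.arange(n), labels] + 1e-12))
    ce_b = -np.mean(np.log(softmax(logits_b)[np.arange(n), labels] + 1e-12))
    loss_a = (1 - alpha) * ce_a + alpha * kl_div(pb, pa)
    loss_b = (1 - alpha) * ce_b + alpha * kl_div(pa, pb)
    return loss_a, loss_b
```

In a real training loop, each peer would take a gradient step on its own loss every batch, so the "teacher" signal improves as both students improve.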