Knowledge Distillation
Self-Knowledge Distillation
A technique in which a model distills knowledge from itself, with no separate teacher network: soft targets come from the model's own predictions at earlier training stages, or from different branches or layers of the same network, and are used as an extra training signal to improve performance.
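A minimal sketch of one common variant, assuming PyTorch: a frozen snapshot of the model from an earlier training stage supplies soft targets for the current model, combined with the usual hard-label loss. The toy architecture, temperature T, and weight alpha below are illustrative assumptions, not taken from the text above.

```python
import copy
import torch
import torch.nn as nn
import torch.nn.functional as F

# Toy classifier; stands in for any model (hypothetical architecture).
model = nn.Sequential(nn.Linear(20, 64), nn.ReLU(), nn.Linear(64, 10))
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)

T = 4.0      # temperature for softening the distributions (assumed value)
alpha = 0.5  # weight between hard-label loss and self-distillation loss (assumed)

# Snapshot of the model from an earlier training stage acts as its own "teacher".
teacher = copy.deepcopy(model)
teacher.eval()

def train_step(x, y):
    logits = model(x)

    # Standard supervised loss against the ground-truth labels.
    ce_loss = F.cross_entropy(logits, y)

    # Soft targets come from the model's own earlier snapshot (no external teacher).
    with torch.no_grad():
        teacher_logits = teacher(x)
    kd_loss = F.kl_div(
        F.log_softmax(logits / T, dim=-1),
        F.softmax(teacher_logits / T, dim=-1),
        reduction="batchmean",
    ) * (T * T)

    loss = alpha * ce_loss + (1 - alpha) * kd_loss
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()

# Every few epochs the snapshot can be refreshed so the model keeps
# distilling from its own more recent knowledge:
# teacher = copy.deepcopy(model); teacher.eval()
```

The branch-based variant mentioned above follows the same pattern, except the soft targets come from an auxiliary branch or a deeper layer of the same network within a single forward pass rather than from an earlier snapshot.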