Knowledge Distillation
Zero-Shot Knowledge Distillation
An approach that distills knowledge from a teacher model without requiring access to the original training data, using only the pre-trained model's weights.
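A common way to realize this is to synthesize pseudo-inputs directly from the teacher's weights (e.g., by optimizing noise until the teacher is confident about a class), then train the student on the teacher's soft labels for those inputs. Below is a minimal sketch of that idea with a toy linear softmax teacher in NumPy; the teacher weights `W_t`, the step counts, and the learning rates are all illustrative assumptions, not part of any specific published method.

```python
import numpy as np

def softmax(z):
    z = z - z.max()
    e = np.exp(z)
    return e / e.sum()

rng = np.random.default_rng(0)
D, C = 8, 3                       # input dim, number of classes
W_t = rng.normal(size=(C, D))     # stand-in for pre-trained teacher weights

# Step 1: synthesize one "class impression" per class using only the
# teacher's weights (gradient ascent on log p_c(x), no real data).
def synthesize(c, steps=200, lr=0.5):
    x = rng.normal(size=D) * 0.01
    for _ in range(steps):
        p = softmax(W_t @ x)
        x += lr * (W_t[c] - p @ W_t)   # d log p_c / dx for a linear teacher
    return x

pseudo = []
for c in range(C):
    x = synthesize(c)
    pseudo.append((x, softmax(W_t @ x)))   # input + teacher soft label

# Step 2: distill into a student by matching the teacher's soft labels
# on the synthetic inputs (cross-entropy gradient for a linear student).
W_s = np.zeros((C, D))
for _ in range(500):
    for x, p_t in pseudo:
        p_s = softmax(W_s @ x)
        W_s -= 0.1 * np.outer(p_s - p_t, x)

agree = all(np.argmax(W_s @ x) == np.argmax(p_t) for x, p_t in pseudo)
print(agree)
```

In a realistic setting the teacher is a deep network and the synthesis step uses backpropagation through it (often with extra regularizers on the generated inputs), but the two-phase structure — generate pseudo-data from the weights, then distill on it — is the same.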