Knowledge Distillation
Self-Distillation
A process in which a model improves itself by transferring its own knowledge to a deeper or wider version of itself, training the larger student on the original model's soft predictions rather than on hard labels alone.
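The knowledge transfer is typically done with a temperature-softened distillation loss: the student is trained to match the teacher's softened output distribution. Below is a minimal, illustrative sketch of that loss in plain NumPy; the logit values and the temperature T=2.0 are hypothetical choices for demonstration, not values from this entry.

```python
import numpy as np

def softmax(logits, T=1.0):
    """Temperature-scaled softmax; higher T produces softer distributions."""
    z = np.asarray(logits, dtype=float) / T
    z -= z.max()  # numerical stability
    e = np.exp(z)
    return e / e.sum()

def distillation_loss(teacher_logits, student_logits, T=2.0):
    """KL(teacher || student) on temperature-softened outputs,
    scaled by T^2 so gradients stay comparable across temperatures."""
    p = softmax(teacher_logits, T)  # soft targets from the (smaller) teacher
    q = softmax(student_logits, T)  # predictions of the deeper/wider student
    return float(np.sum(p * (np.log(p) - np.log(q))) * T * T)

# Hypothetical logits for a 3-class problem.
teacher_logits = [4.0, 1.0, 0.5]
student_logits = [2.5, 1.5, 1.0]
loss = distillation_loss(teacher_logits, student_logits)
```

Minimizing this loss (usually combined with the ordinary cross-entropy on ground-truth labels) pulls the student's distribution toward the teacher's; the loss is zero exactly when the two distributions match.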