Knowledge Distillation
Ensemble Distillation
Compression of an ensemble of models into a single compact model that preserves the diversity of the ensemble's collective knowledge.
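A minimal sketch of the idea, assuming the standard distillation recipe of averaging the ensemble members' temperature-softened predictions and training the student against that average; all function names here are illustrative, not from a specific library:

```python
import numpy as np

def softmax(logits, T=1.0):
    """Temperature-softened softmax over the last axis."""
    z = np.asarray(logits, dtype=float) / T
    z = z - z.max(axis=-1, keepdims=True)  # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def ensemble_soft_targets(member_logits, T=2.0):
    """Average the softened predictions of each ensemble member.

    The averaged distribution is the distillation target: it retains
    the ensemble's collective uncertainty over classes, which hard
    labels would discard.
    """
    probs = [softmax(l, T) for l in member_logits]
    return np.mean(probs, axis=0)

def distillation_loss(student_logits, soft_targets, T=2.0):
    """Cross-entropy between the ensemble targets and the student."""
    p = softmax(student_logits, T)
    return -np.sum(soft_targets * np.log(p + 1e-12), axis=-1).mean()

# Hypothetical logits from three ensemble members for a batch of two inputs.
members = [np.array([[2.0, 0.5, 0.1], [0.2, 1.5, 0.3]]) for _ in range(3)]
targets = ensemble_soft_targets(members)
loss = distillation_loss(np.zeros((2, 3)), targets)  # an untrained student
```

Minimizing this loss over the student's parameters (with any gradient-based optimizer) transfers the ensemble's averaged behavior into the single compact model.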