Knowledge Distillation
Ensemble Distillation
Compression of an ensemble of models into a single compact model that preserves the diversity of the ensemble's collective knowledge.
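A minimal sketch of how such a distillation signal is typically formed (plain Python; the function names and the temperature value are illustrative assumptions, not a specific library's API): the ensemble members' temperature-softened predictions are averaged into soft targets, and the compact student model is then trained to minimize the KL divergence between its own softened output and those targets.

```python
import math

def softmax(logits, temperature=1.0):
    # Temperature-scaled softmax: a higher temperature softens the
    # distribution, exposing the "dark knowledge" in small probabilities.
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def ensemble_soft_targets(member_logits, temperature=2.0):
    # Average the members' softened predictions; this averaged
    # distribution is the soft target the student learns to match.
    probs = [softmax(logits, temperature) for logits in member_logits]
    n = len(probs)
    return [sum(p[i] for p in probs) / n for i in range(len(probs[0]))]

def distillation_loss(student_logits, soft_targets, temperature=2.0):
    # KL(targets || student): zero when the student reproduces the
    # ensemble's averaged distribution, positive otherwise.
    q = softmax(student_logits, temperature)
    return sum(t * math.log(t / qi) for t, qi in zip(soft_targets, q) if t > 0)
```

In practice this loss is minimized by gradient descent over the student's parameters, often combined with a standard cross-entropy term on the true labels; averaging probabilities (rather than logits) is one common choice for pooling the ensemble.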