Knowledge Distillation
Ensemble Distillation
Compression of an ensemble of models into a single compact model while preserving the diversity of the ensemble's collective knowledge.
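As an illustration (not part of the original entry), here is a minimal NumPy sketch of the idea: several "teacher" models produce softened predictive distributions, these are averaged into soft targets, and a single student is trained to match them. The data, teacher construction, temperature, and learning rate are all hypothetical toy choices.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(z):
    z = z - z.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

# Toy 2-class problem with 4 features (hypothetical setup).
X = rng.normal(size=(200, 4))
true_w = rng.normal(size=(4, 2))
y = np.argmax(X @ true_w, axis=1)

# "Ensemble": three linear teachers, simulated here as
# perturbed copies of the true weights (stand-ins for models
# trained from different random initializations).
teachers = [true_w + 0.3 * rng.normal(size=true_w.shape) for _ in range(3)]

# Soft targets: average of the teachers' temperature-softened
# predictive distributions -- this is where the ensemble's
# collective knowledge is aggregated.
T = 2.0  # distillation temperature (assumed value)
soft_targets = np.mean([softmax((X @ w) / T) for w in teachers], axis=0)

# Distill: fit one compact student by gradient descent on the
# cross-entropy between its softened outputs and the soft targets.
W = np.zeros((4, 2))
lr = 0.5
for _ in range(500):
    p = softmax((X @ W) / T)
    # Gradient of mean cross-entropy w.r.t. W for softmax logits X @ W / T.
    grad = X.T @ (p - soft_targets) / len(X) / T
    W -= lr * grad

# The single student should now largely reproduce the ensemble's
# majority prediction.
student_pred = np.argmax(X @ W, axis=1)
ensemble_pred = np.argmax(
    np.mean([softmax(X @ w) for w in teachers], axis=0), axis=1
)
agreement = np.mean(student_pred == ensemble_pred)
print(f"student/ensemble agreement: {agreement:.2f}")
```

After training, one compact model stands in for the whole ensemble at inference time, which is the practical payoff of ensemble distillation.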