Knowledge Distillation
Cross-Domain Distillation
A technique for transferring knowledge between models that operate on different domains but share a similar underlying structure, such as a common label space or task formulation. A teacher trained on a source domain provides soft targets that guide a student learning on a related target domain.
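As a minimal sketch of the core mechanism, the snippet below computes a standard temperature-softened distillation loss (KL divergence between teacher and student output distributions); in the cross-domain setting, the teacher logits would come from the source-domain model and the student logits from the target-domain model over a shared label space. All function names and the temperature value are illustrative assumptions, not from the original text.

```python
import numpy as np

def softmax(logits, temperature=1.0):
    # Temperature-scaled softmax; higher T yields softer distributions.
    z = np.asarray(logits, dtype=float) / temperature
    z = z - z.max(axis=-1, keepdims=True)  # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, temperature=2.0):
    """KL(teacher || student) on softened distributions, scaled by T^2
    (a common convention so gradients stay comparable across temperatures)."""
    p = softmax(teacher_logits, temperature)  # soft targets from the (source-domain) teacher
    q = softmax(student_logits, temperature)  # predictions from the (target-domain) student
    kl = (p * (np.log(p + 1e-12) - np.log(q + 1e-12))).sum(axis=-1)
    return float(kl.mean() * temperature ** 2)
```

When student and teacher agree exactly, the loss is zero; it grows as their softened distributions diverge, so minimizing it pulls the student's target-domain predictions toward the teacher's source-domain knowledge.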