Continual Meta-Learning
Meta-Knowledge Distillation
The process of compressing meta-knowledge acquired during continual learning into a more compact model with minimal loss in performance.
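Such compression is typically driven by a distillation objective: the compact (student) model is trained to match the softened output distribution of the larger (teacher) model. A minimal sketch of that soft-target loss, in plain NumPy; the function names, the temperature value, and the toy logits are illustrative assumptions, not part of the original entry.

```python
import numpy as np

def softmax(z, T=1.0):
    # Temperature-scaled softmax over the last axis; higher T softens the
    # distribution, exposing more of the teacher's "dark knowledge".
    z = z / T
    z = z - z.max(axis=-1, keepdims=True)  # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, T=2.0):
    """KL divergence between temperature-softened teacher and student
    distributions -- the standard objective for compressing a large
    teacher model into a compact student."""
    p = softmax(teacher_logits, T)  # soft targets from the teacher
    q = softmax(student_logits, T)  # student predictions
    # The T^2 factor keeps gradient magnitudes comparable across temperatures.
    return (T ** 2) * np.sum(p * (np.log(p) - np.log(q)), axis=-1).mean()

# Toy example: a student whose logits are close to the teacher's
# incurs a small loss; identical logits give a loss of zero.
teacher = np.array([[2.0, 0.5, -1.0]])
student = np.array([[1.8, 0.6, -0.9]])
loss = distillation_loss(student, teacher, T=2.0)
```

In a continual meta-learning setting, the teacher logits would come from the model that accumulated meta-knowledge across tasks, and this loss would be minimized alongside the student's task loss.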