Cross-Lingual Transfer
Cross-Lingual Knowledge Distillation
A technique in which a large multilingual teacher model transfers its knowledge to a more compact student model, preserving cross-lingual capabilities while reducing computational cost. This enables effective deployment of multilingual models on resource-constrained hardware.
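A minimal sketch of the core training signal: the student is trained to match the teacher's temperature-softened output distribution on parallel inputs in different languages. All names and logit values here are hypothetical illustrations, not a specific library's API; real systems would compute logits with actual teacher and student networks and minimize this loss with gradient descent.

```python
import math

def softmax(logits, temperature=1.0):
    # Temperature-scaled softmax: a higher temperature yields a
    # softer distribution that exposes more of the teacher's
    # "dark knowledge" about relative class similarities.
    scaled = [x / temperature for x in logits]
    m = max(scaled)
    exps = [math.exp(x - m) for x in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(teacher_logits, student_logits, temperature=2.0):
    # KL(teacher || student) on temperature-softened distributions,
    # scaled by T^2 as is conventional in knowledge distillation.
    p = softmax(teacher_logits, temperature)
    q = softmax(student_logits, temperature)
    kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)
    return temperature ** 2 * kl

# Hypothetical teacher logits for the same sentence in English and
# German. Training the student against both transfers the teacher's
# cross-lingual behavior into the smaller model.
teacher_logits_en = [3.2, 1.1, -0.5]
teacher_logits_de = [3.0, 1.3, -0.4]
student_logits = [2.0, 0.9, -0.2]

loss = (distillation_loss(teacher_logits_en, student_logits)
        + distillation_loss(teacher_logits_de, student_logits))
print(loss > 0.0)
```

In practice this distillation term is usually combined with a standard task loss on hard labels, and the temperature is tuned as a hyperparameter.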