Cross-Lingual Transfer
Multilingual Transformer
An attention-based architecture designed to process multiple languages within a single unified model, using parameters and embeddings shared across languages. Such transformers form the basis of modern cross-lingual transfer models.
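A minimal sketch of the idea, assuming the Hugging Face transformers library, PyTorch, and the pretrained xlm-roberta-base checkpoint (none of which the original entry names): one shared set of parameters embeds sentences from different languages into the same vector space, which is what makes transfer across languages possible.

```python
# Sketch: a single multilingual transformer (here XLM-R, an assumption)
# encodes text in many languages with one shared set of parameters.
import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("xlm-roberta-base")
model = AutoModel.from_pretrained("xlm-roberta-base")

def embed(text: str) -> torch.Tensor:
    """Mean-pool the last hidden states into one sentence vector."""
    inputs = tokenizer(text, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**inputs).last_hidden_state   # (1, seq_len, hidden)
    mask = inputs["attention_mask"].unsqueeze(-1)    # ignore padding tokens
    return (hidden * mask).sum(dim=1) / mask.sum(dim=1)

# The same parameters embed an English sentence and its Spanish
# translation; in a shared multilingual space, translations tend to
# score higher cosine similarity than unrelated sentences.
en = embed("The weather is nice today.")
es = embed("El clima está agradable hoy.")
print(torch.cosine_similarity(en, es).item())
```

Because every language passes through the same weights, a model fine-tuned on labeled data in one language can often be applied directly to others, which is the cross-lingual transfer this entry refers to.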