Cross-Lingual Zero-Shot
Multilingual Transformer Models
Neural architectures pre-trained jointly on text from many languages, capable of zero-shot cross-lingual transfer: after being trained on a task in one language, they can perform the same task in languages for which no task-specific training data was seen.
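A minimal sketch of this idea, assuming the Hugging Face `transformers` library and the multilingual checkpoint `xlm-roberta-base` (both illustrative choices, not prescribed by this entry): the model is fine-tuned on English task data only and then applied directly to text in another language.

```python
# Sketch of zero-shot cross-lingual transfer (model and example are assumptions).
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_name = "xlm-roberta-base"  # pre-trained jointly on text from ~100 languages
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSequenceClassification.from_pretrained(model_name, num_labels=2)

# Step 1: fine-tune the classification head on English task data only
# (training loop omitted; without it the prediction below is untrained).

# Step 2: apply the fine-tuned model to a language never seen in task training.
german_text = "Dieser Film war großartig."  # "This film was great."
inputs = tokenizer(german_text, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits
prediction = logits.argmax(dim=-1).item()  # label predicted zero-shot for German
print(prediction)
```

The transfer works because the shared multilingual pre-training places semantically similar sentences from different languages close together in the model's representation space, so a task head learned in one language remains usable in others.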