Transfer Learning for NLP
Masked Language Modeling
A pre-training task in which the model predicts randomly masked tokens in a sequence from their surrounding context, forcing it to learn deep contextual representations.
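The masking procedure can be sketched as follows. This is a minimal illustration, assuming BERT-style corruption: roughly 15% of positions are chosen as prediction targets, and of those, 80% are replaced by a mask token, 10% by a random token, and 10% left unchanged. The token list, vocabulary, and function name are hypothetical.

```python
import random

MASK = "[MASK]"
VOCAB = ["the", "a", "cat", "dog", "runs", "sleeps"]  # toy vocabulary

def mask_tokens(tokens, mask_prob=0.15, seed=0):
    """Return (corrupted, labels): labels[i] is the original token
    the model must predict, or None if position i is not a target."""
    rng = random.Random(seed)
    corrupted, labels = [], []
    for tok in tokens:
        if rng.random() < mask_prob:
            labels.append(tok)          # model must recover this token
            r = rng.random()
            if r < 0.8:
                corrupted.append(MASK)              # 80%: mask token
            elif r < 0.9:
                corrupted.append(rng.choice(VOCAB)) # 10%: random token
            else:
                corrupted.append(tok)               # 10%: unchanged
        else:
            labels.append(None)         # not a prediction target
            corrupted.append(tok)
    return corrupted, labels
```

During pre-training, the loss is computed only at the target positions, so the model must use the surrounding (unmasked) context to recover the originals.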