BERT (Bidirectional Encoder Representations from Transformers)
Pre-training Objectives
Self-supervised tasks — Masked Language Modeling (MLM) and Next Sentence Prediction (NSP) — used to pre-train BERT on large unlabeled corpora, enabling it to learn general-purpose linguistic representations that can later be fine-tuned on downstream tasks.
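The MLM objective corrupts the input before prediction: roughly 15% of token positions are selected, and of those, 80% are replaced with a [MASK] token, 10% with a random token, and 10% are left unchanged (the model must still predict the original at all selected positions). A minimal sketch of this corruption step, with hypothetical helper names (`mlm_mask`) and a toy word-level vocabulary rather than BERT's real WordPiece tokenizer:

```python
import random

def mlm_mask(tokens, vocab, mask_prob=0.15, seed=0):
    """BERT-style MLM corruption (sketch).

    Selects ~mask_prob of positions; of those, 80% become "[MASK]",
    10% become a random vocabulary token, 10% stay unchanged.
    Returns (corrupted, labels): labels hold the original token at
    selected positions and None elsewhere (ignored by the loss).
    """
    rng = random.Random(seed)
    corrupted = list(tokens)
    labels = [None] * len(tokens)
    for i, tok in enumerate(tokens):
        if rng.random() < mask_prob:
            labels[i] = tok          # target for the prediction head
            r = rng.random()
            if r < 0.8:
                corrupted[i] = "[MASK]"
            elif r < 0.9:
                corrupted[i] = rng.choice(vocab)
            # else: keep the original token unchanged
    return corrupted, labels

tokens = "the cat sat on the mat".split()
corrupted, labels = mlm_mask(tokens, vocab=tokens, seed=1)
```

During pre-training, the cross-entropy loss is computed only at the selected positions; the 10% random / 10% unchanged cases discourage the model from relying on seeing [MASK] at inference time.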