BERT (Bidirectional Encoder Representations from Transformers)
Positional Embeddings
Vectors added to token embeddings in BERT to encode each token's position in the sequence. They are essential because self-attention is permutation-invariant and by itself does not capture token order.
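A minimal sketch of the idea in NumPy, with hypothetical (randomly initialized) embedding tables and made-up dimensions; BERT learns these tables during pretraining and also adds segment embeddings, which are omitted here:

```python
import numpy as np

rng = np.random.default_rng(0)
vocab_size, max_len, d_model = 100, 16, 8

# Hypothetical learned lookup tables (random stand-ins for trained weights).
token_emb = rng.normal(size=(vocab_size, d_model))  # one vector per token id
pos_emb = rng.normal(size=(max_len, d_model))       # one vector per position

token_ids = np.array([5, 42, 7, 42])  # token 42 appears at two positions

# BERT-style input: token embedding + positional embedding, elementwise sum.
positions = np.arange(len(token_ids))
x = token_emb[token_ids] + pos_emb[positions]

# The two occurrences of token 42 now get different input vectors,
# so attention layers can distinguish them by position.
assert not np.allclose(x[1], x[3])
```

Because the positional vector differs at each position, repeated tokens receive distinct inputs, restoring the order information that attention alone would discard.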