AI Glossary
The complete dictionary of Artificial Intelligence
BERT Embeddings
Contextual representations generated by BERT that capture the meaning of words based on their specific context in the sentence.
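A toy NumPy sketch of why contextual embeddings differ from static ones. The single neighbour-averaging step below is a crude, hypothetical stand-in for BERT's many self-attention layers, and the vectors are illustrative values, not trained weights:

```python
import numpy as np

# Toy static vectors (illustrative values, not trained).
static = {
    "bank":  np.array([0.5, 0.5]),
    "river": np.array([0.0, 1.0]),
    "money": np.array([1.0, 0.0]),
}

def contextual(tokens, i, alpha=0.5):
    # Crude stand-in for self-attention: mix a token's static vector
    # with the average of its neighbours. Real BERT stacks many
    # transformer layers instead of this single averaging step.
    ctx = np.mean([static[t] for j, t in enumerate(tokens) if j != i], axis=0)
    return (1 - alpha) * static[tokens[i]] + alpha * ctx

v_river_bank = contextual(["river", "bank"], 1)
v_money_bank = contextual(["money", "bank"], 1)
# The same word "bank" receives different vectors in different contexts.
assert not np.allclose(v_river_bank, v_money_bank)
```

The key observation survives the simplification: the representation of "bank" depends on the sentence it appears in, which a static lookup table cannot express.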
Segment Embedding
Vector representations used in BERT to distinguish different text segments (e.g., sentences A and B in sentence-pair tasks such as next-sentence prediction).
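A minimal sketch of how BERT composes its input representation: token, segment, and position embeddings are summed element-wise. The table sizes and random values here are toy assumptions; in BERT all three tables are learned parameters:

```python
import numpy as np

rng = np.random.default_rng(0)
d = 4  # embedding dimension (toy size)

token_emb = rng.normal(size=(10, d))    # vocabulary of 10 token ids
segment_emb = rng.normal(size=(2, d))   # segment A = 0, segment B = 1
position_emb = rng.normal(size=(8, d))  # up to 8 positions

# Input: [CLS] sentence-A tokens [SEP] sentence-B tokens.
token_ids   = [1, 4, 5, 2, 7, 8]
segment_ids = [0, 0, 0, 0, 1, 1]  # marks which segment each token belongs to

# BERT's input representation: the three embeddings are summed per position.
inputs = np.array([
    token_emb[t] + segment_emb[s] + position_emb[p]
    for p, (t, s) in enumerate(zip(token_ids, segment_ids))
])
print(inputs.shape)  # (6, 4)
```

The segment table has only two rows: every token in sentence A gets row 0 added to it, every token in sentence B gets row 1.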
Static Embedding
Fixed vector representations for each word regardless of context, generated by models like Word2Vec or GloVe.
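A small sketch of what "static" means in practice: the model is just a lookup table, so a word maps to the same vector in every sentence. The 3-dimensional vectors are made-up illustrative values, not trained Word2Vec or GloVe output:

```python
import numpy as np

# Toy static embedding table: each word maps to one fixed vector,
# independent of the sentence it appears in (as in Word2Vec/GloVe).
embeddings = {
    "bank":  np.array([0.2, 0.7, 0.1]),
    "river": np.array([0.1, 0.9, 0.0]),
    "money": np.array([0.8, 0.1, 0.3]),
}

def lookup(word):
    # The same vector is returned for "bank" in "river bank"
    # and in "bank account" alike -- that is what "static" means.
    return embeddings[word]

assert np.array_equal(lookup("bank"), lookup("bank"))
```

This is the property that contextual models like BERT relax: there, the vector for a token also depends on its neighbours.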
Sentence Embedding
Single vector representation that captures the overall meaning of an entire sentence, used for semantic similarity and classification tasks.
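One common way to build a sentence embedding is mean pooling: average the word vectors into a single vector, then compare sentences with cosine similarity. The word vectors below are hypothetical toy values, not a trained model:

```python
import numpy as np

# Hypothetical static word vectors; real systems would use trained ones.
word_vecs = {
    "cats":   np.array([0.9, 0.1]),
    "dogs":   np.array([0.8, 0.2]),
    "like":   np.array([0.1, 0.5]),
    "stocks": np.array([0.0, 0.9]),
    "rose":   np.array([0.1, 0.8]),
}

def sentence_embedding(tokens):
    # Mean pooling: average the word vectors into one sentence vector.
    return np.mean([word_vecs[t] for t in tokens], axis=0)

def cosine(a, b):
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

s_pets = sentence_embedding(["cats", "like", "dogs"])
s_market = sentence_embedding(["stocks", "rose"])
# Sentences about the same topic end up closer than unrelated ones.
assert cosine(s_pets, s_pets) > cosine(s_pets, s_market)
```

Mean pooling ignores word order; stronger sentence encoders (e.g., BERT-based ones) learn a pooling that does not.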
Document Embedding
Vectorization of entire documents that preserves semantic relationships between different documents in a continuous vector space.
Vector Space Model
Algebraic model that represents textual objects as vectors in a multidimensional space to facilitate similarity calculations.
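A classic instance of the vector space model: represent each document as a term-frequency vector over a shared vocabulary, so similarity becomes cosine similarity between vectors. The three documents are illustrative examples:

```python
import numpy as np
from collections import Counter

docs = [
    "the cat sat on the mat",
    "the dog sat on the log",
    "stocks fell sharply",
]

# A shared vocabulary gives every document the same axes.
vocab = sorted({w for d in docs for w in d.split()})

def to_vector(doc):
    # Term-frequency vector: one count per vocabulary word.
    counts = Counter(doc.split())
    return np.array([counts[w] for w in vocab], dtype=float)

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

vecs = [to_vector(d) for d in docs]
# Documents with overlapping terms are closer; disjoint ones score 0.
assert cosine(vecs[0], vecs[1]) > cosine(vecs[0], vecs[2])
```

Raw term frequencies are usually reweighted (e.g., TF-IDF) before computing similarities, but the geometry is the same.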
Fine-tuning Embeddings
Process of adapting pre-trained embeddings to a specific domain or task by continuing training on targeted data.
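A highly simplified sketch of the effect of fine-tuning: a gradient-style update nudges the vectors of a same-domain word pair toward each other, much as a contrastive loss would for a positive pair. The vectors and learning rate are toy assumptions, not a real training setup:

```python
import numpy as np

def cosine(a, b):
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# "Pre-trained" toy vectors; in a real fine-tune these would come
# from a model such as Word2Vec or BERT.
emb = {
    "virus": np.array([0.9, 0.1]),
    "worm":  np.array([0.1, 0.9]),  # far apart in the general domain
}

lr = 0.1
before = cosine(emb["virus"], emb["worm"])

# One illustrative update step: pull each vector of the positive
# (same-domain) pair toward the other.
for a, b in [("virus", "worm"), ("worm", "virus")]:
    emb[a] = emb[a] + lr * (emb[b] - emb[a])

after = cosine(emb["virus"], emb["worm"])
assert after > before
```

In a security-domain corpus "virus" and "worm" co-occur as related malware terms, so fine-tuning moves them closer than their general-domain positions.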
Embedding Layer
Neural network layer that transforms discrete token indices into continuous dense vectors, serving as the first layer in most NLP models.
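An embedding layer is just a learnable matrix whose rows are indexed by token id, so the forward pass is a row lookup. A minimal NumPy sketch (randomly initialized, stand-in for e.g. PyTorch's `nn.Embedding`):

```python
import numpy as np

rng = np.random.default_rng(42)
vocab_size, dim = 1000, 16

# Learnable parameter matrix: row i is the dense vector for token id i.
W = rng.normal(scale=0.02, size=(vocab_size, dim))

def embedding_layer(token_ids):
    # Forward pass: discrete indices -> continuous dense vectors.
    return W[token_ids]  # shape (seq_len, dim)

out = embedding_layer(np.array([5, 42, 7]))
print(out.shape)  # (3, 16)
```

During training, gradients flow only into the rows that were looked up, which is why frameworks implement this as a sparse update rather than a dense matrix multiply.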