AI Glossary
The complete dictionary of Artificial Intelligence
Distributional Semantics
Linguistic theory postulating that the meaning of a word can be inferred from the contexts in which it appears, forming the mathematical basis of modern vector representations.
Word Embedding
Technique of dense vector representation of words in a multidimensional space where semantically similar words are positioned close to each other.
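A minimal sketch of the idea: words mapped to dense vectors, with semantically related words lying closer together. The 3-dimensional vectors below are hand-picked toy values, not trained embeddings.

```python
import math

# Toy embedding table; real embeddings have hundreds of dimensions
# and are learned from large corpora.
embeddings = {
    "king":  [0.80, 0.65, 0.10],
    "queen": [0.78, 0.70, 0.12],
    "apple": [0.10, 0.20, 0.90],
}

def euclidean(u, v):
    """Euclidean distance between two vectors."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(u, v)))

# Semantically similar words sit closer together in the space.
d_related = euclidean(embeddings["king"], embeddings["queen"])
d_unrelated = euclidean(embeddings["king"], embeddings["apple"])
```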
Transformer Model
Neural network architecture based on attention mechanisms that processes all positions of a text in parallel, without relying on the recurrence of earlier sequential models.
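The core operation is scaled dot-product attention, sketched below in pure Python for a single head and toy 2-dimensional vectors; real implementations are batched tensor code with learned projection matrices.

```python
import math

def softmax(xs):
    """Numerically stable softmax over a list of scores."""
    exps = [math.exp(x - max(xs)) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

def attention(queries, keys, values):
    """softmax(Q K^T / sqrt(d)) V, for lists of vectors."""
    d = len(keys[0])
    out = []
    for q in queries:
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d)
                  for k in keys]
        weights = softmax(scores)
        out.append([sum(w * v[j] for w, v in zip(weights, values))
                    for j in range(len(values[0]))])
    return out

# Every position attends to every other position at once (no recurrence).
Q = K = V = [[1.0, 0.0], [0.0, 1.0]]
ctx = attention(Q, K, V)
```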
Sentiment Analysis
Computational process of identifying and extracting opinions, emotions, and subjective attitudes expressed in unstructured text.
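A toy lexicon-based sketch of the task; the word lists are illustrative placeholders, and real systems use learned classifiers or much larger sentiment lexicons.

```python
# Hand-made cue-word lists (assumption: illustrative only).
POSITIVE = {"great", "excellent", "love", "good"}
NEGATIVE = {"terrible", "awful", "hate", "bad"}

def sentiment(text):
    """Label text by counting positive vs. negative cue words."""
    tokens = text.lower().split()
    score = sum(t in POSITIVE for t in tokens) - sum(t in NEGATIVE for t in tokens)
    if score > 0:
        return "positive"
    if score < 0:
        return "negative"
    return "neutral"

label = sentiment("I love this phone and the screen is great")
```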
Named Entity Recognition (NER)
NLP task consisting of identifying and classifying named entities (people, organizations, places, dates) in unstructured texts.
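A toy gazetteer-based sketch of NER; the entity lists below are made up, and real systems use statistical or neural sequence labelers rather than exact string lookup.

```python
# Hand-made gazetteer mapping surface forms to entity labels.
GAZETTEER = {
    "Marie Curie": "PERSON",
    "Paris": "LOCATION",
    "UNESCO": "ORGANIZATION",
}

def find_entities(text):
    """Return (surface form, label, offset) for each gazetteer match."""
    found = []
    for surface, label in GAZETTEER.items():
        start = text.find(surface)
        if start != -1:
            found.append((surface, label, start))
    return sorted(found, key=lambda e: e[2])

entities = find_entities("Marie Curie worked in Paris.")
```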
Language Models
Probabilistic systems that calculate the probability of a sequence of words and can generate coherent text by predicting the next word.
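The simplest instance is an n-gram model; the bigram sketch below estimates P(next word | previous word) by counting over a toy corpus, whereas modern language models are neural networks trained on vastly larger data.

```python
from collections import Counter, defaultdict

corpus = "the cat sat on the mat the cat ate".split()

# Count how often each word follows each other word.
bigrams = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    bigrams[prev][nxt] += 1

def next_word_prob(prev, nxt):
    """P(nxt | prev) by maximum-likelihood estimation."""
    counts = bigrams[prev]
    total = sum(counts.values())
    return counts[nxt] / total if total else 0.0

def predict(prev):
    """Most likely continuation of `prev` under the model."""
    return bigrams[prev].most_common(1)[0][0]

p = next_word_prob("the", "cat")
```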
Semantic Similarity
Quantitative measure of the degree of meaning resemblance between two linguistic units, based on their vector representations in the semantic space.
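The standard measure over vector representations is cosine similarity; the sketch below uses hand-made toy vectors in place of trained embeddings.

```python
import math

def cosine(u, v):
    """Cosine of the angle between two vectors: u.v / (|u| |v|)."""
    dot = sum(a * b for a, b in zip(u, v))
    norm = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    return dot / norm

# Toy vectors (assumption: illustrative values, not learned).
car, truck, banana = [0.9, 0.8, 0.1], [0.85, 0.75, 0.2], [0.1, 0.2, 0.95]
sim_close = cosine(car, truck)
sim_far = cosine(car, banana)
```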
Latent Semantic Analysis (LSA)
Dimensionality reduction method applied to the document-term matrix to discover latent semantic structures in a text corpus.
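A minimal sketch of LSA as truncated SVD of a toy term-document count matrix (rows = terms, columns = documents); the counts are illustrative, and real corpora would first be weighted, e.g. with tf-idf.

```python
import numpy as np

# Counts of 4 terms across 3 documents (toy values).
X = np.array([
    [2, 0, 1],   # "ship"
    [1, 0, 0],   # "boat"
    [0, 3, 2],   # "election"
    [0, 2, 1],   # "vote"
], dtype=float)

U, s, Vt = np.linalg.svd(X, full_matrices=False)
k = 2                                          # keep top-k latent dimensions
X_k = U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]    # rank-k approximation of X
term_vectors = U[:, :k] * s[:k]                # terms in the latent space
```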
Word2Vec
Unsupervised learning algorithm that creates vector representations of words either by predicting a word from its context (CBOW) or by predicting the context words from a given word (Skip-gram).
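The first step of Skip-gram training is extracting (center, context) pairs from text; the sketch below does only that pair extraction, leaving out the neural prediction objective itself.

```python
def skipgram_pairs(tokens, window=2):
    """Yield (center, context) pairs for a symmetric context window."""
    pairs = []
    for i, center in enumerate(tokens):
        lo, hi = max(0, i - window), min(len(tokens), i + window + 1)
        for j in range(lo, hi):
            if j != i:
                pairs.append((center, tokens[j]))
    return pairs

pairs = skipgram_pairs("the quick brown fox".split(), window=1)
```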
GloVe
Vector representation learning algorithm that combines the advantages of the global co-occurrence matrix and local contextual predictions.
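The global statistic GloVe starts from is a word-word co-occurrence matrix; the sketch below only builds those counts, omitting the weighted least-squares fit over their logarithms that actually learns the vectors.

```python
from collections import Counter

def cooccurrence(tokens, window=2):
    """Count how often each ordered word pair co-occurs within the window."""
    counts = Counter()
    for i, w in enumerate(tokens):
        for j in range(max(0, i - window), i):
            counts[(w, tokens[j])] += 1
            counts[(tokens[j], w)] += 1
    return counts

counts = cooccurrence("ice is cold and steam is hot".split(), window=2)
```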
Dependency parsing
Process of identifying the syntactic head-dependent relationships between the words of a sentence, represented as a dependency tree or graph.
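A sketch of how such a parse can be represented as head-pointer arcs; the parse of this sentence is written by hand, not produced by an actual parser.

```python
tokens = ["She", "reads", "books"]

# Each arc: (dependent index, head index, relation label); -1 marks the root.
arcs = [(0, 1, "nsubj"),   # "She"   depends on "reads"
        (1, -1, "root"),   # "reads" is the sentence root
        (2, 1, "obj")]     # "books" depends on "reads"

def children(head_idx):
    """Indices of tokens whose head is head_idx."""
    return [d for d, h, _ in arcs if h == head_idx]
```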
Relation extraction
NLP task consisting of identifying and classifying the semantic relationships between entities mentioned in unstructured text.
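A toy pattern-based sketch of the task; the single regex below is illustrative only, and real systems use syntactic features or neural classifiers over many relation types.

```python
import re

# One hand-written pattern for a single relation type (assumption).
PATTERN = re.compile(r"(?P<head>[A-Z]\w+) was founded by (?P<tail>[A-Z]\w+)")

def extract_relations(text):
    """Return (head, FOUNDED_BY, tail) triples matched by the pattern."""
    return [(m.group("head"), "FOUNDED_BY", m.group("tail"))
            for m in PATTERN.finditer(text)]

triples = extract_relations("Microsoft was founded by Gates in 1975.")
```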
Word sense disambiguation
Process of automatically determining the correct meaning of a polysemous word based on its context of use in a text.
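A classic baseline is the simplified Lesk algorithm: choose the sense whose dictionary gloss overlaps most with the word's context. The two glosses for "bank" below are paraphrased toy entries, not taken from a real dictionary.

```python
# Toy sense inventory for the polysemous word "bank".
SENSES = {
    "bank/finance": "institution that accepts deposits and lends money",
    "bank/river": "sloping land beside a body of water",
}

def lesk(context, senses):
    """Return the sense key whose gloss shares the most words with the context."""
    ctx = set(context.lower().split())
    return max(senses, key=lambda s: len(ctx & set(senses[s].split())))

sense = lesk("he sat on the bank of the river watching the water", SENSES)
```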
Seq2seq models
Neural network architecture that transforms an input sequence into an output sequence, widely used for machine translation and text generation.