AI Glossary
The complete dictionary of Artificial Intelligence
Metric embedding space
Low-dimensional vector space where distances between points reflect semantic similarities between classes, optimized for few-shot classification tasks.
Class prototype
Central vector representation of a class, calculated as the average of embeddings of support examples of that class in the metric space.
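As a minimal sketch of this definition, the prototype of each class is simply the mean of its support embeddings (the arrays below are made-up illustrative values):

```python
import numpy as np

# Hypothetical support embeddings: 2 classes, 3 examples each, 4-dim embeddings
support = np.array([
    [[1.0, 0.0, 0.0, 0.0], [0.8, 0.2, 0.0, 0.0], [1.2, -0.2, 0.0, 0.0]],  # class 0
    [[0.0, 1.0, 0.0, 0.0], [0.0, 0.9, 0.1, 0.0], [0.0, 1.1, -0.1, 0.0]],  # class 1
])

# Prototype c_k = mean of the support embeddings of class k
prototypes = support.mean(axis=1)  # shape (n_classes, embedding_dim) = (2, 4)
```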
Support set
Subset of training data used to calculate class prototypes during a meta-learning episode, containing a limited number of examples per class.
Query set
Set of examples to be classified, used to evaluate the model and update its parameters during training; these examples do not participate in the computation of prototypes.
N-way K-shot
Few-shot learning paradigm where N represents the number of classes to discriminate and K the number of available examples per class in the support set.
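A sketch of how an N-way K-shot episode can be sampled from a labeled dataset; `sample_episode` is an illustrative helper, not a library API:

```python
import numpy as np

def sample_episode(features, labels, n_way, k_shot, q_queries, rng=None):
    """Sample one N-way K-shot episode: pick N classes, then K support
    and Q query examples per class (illustrative sketch)."""
    if rng is None:
        rng = np.random.default_rng(0)
    classes = rng.choice(np.unique(labels), size=n_way, replace=False)
    support, query = [], []
    for c in classes:
        idx = rng.permutation(np.flatnonzero(labels == c))
        support.append(features[idx[:k_shot]])
        query.append(features[idx[k_shot:k_shot + q_queries]])
    return np.stack(support), np.stack(query), classes

# Tiny demo: 5 classes, 10 examples each, 2-dim features (made-up data)
feats = np.random.default_rng(1).normal(size=(50, 2))
labs = np.repeat(np.arange(5), 10)
S, Q, cls = sample_episode(feats, labs, n_way=3, k_shot=2, q_queries=2)
```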
Euclidean distance in embedding
Similarity measure used to classify query examples by calculating their distance to class prototypes in the learned embedding space.
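The classification rule above reduces to a nearest-prototype assignment; a minimal sketch with made-up prototypes and queries:

```python
import numpy as np

# Hypothetical prototypes (2 classes) and query embeddings (3 queries), 2-dim
prototypes = np.array([[0.0, 0.0], [3.0, 4.0]])
queries = np.array([[0.0, 1.0], [3.0, 3.0], [2.0, 2.5]])

# Pairwise Euclidean distances, shape (n_queries, n_classes)
dists = np.linalg.norm(queries[:, None, :] - prototypes[None, :, :], axis=-1)

# Each query is assigned to its nearest prototype
preds = dists.argmin(axis=1)
```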
Softmax on distances
Activation function transforming negative distances to prototypes into probability distribution over classes, used for classifying query examples.
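This transformation can be sketched as a numerically stable softmax over negated distances (the distance values below are illustrative):

```python
import numpy as np

def softmax_over_distances(dists):
    """p(k | x) = softmax(-d(x, c_k)); the max is subtracted for stability."""
    logits = -dists
    logits = logits - logits.max(axis=-1, keepdims=True)
    exp = np.exp(logits)
    return exp / exp.sum(axis=-1, keepdims=True)

# One query, distances 1.0 and 3.0 to the two prototypes:
probs = softmax_over_distances(np.array([[1.0, 3.0]]))
```

The closer prototype (distance 1.0) receives the larger probability mass.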
Meta-learning episode
Training unit simulating a complete few-shot task, including the construction of prototypes from the support set and classification of the query set.
Encoding function
Parameterized neural network that transforms raw inputs into embedding vectors, optimized to minimize intra-class distances and maximize inter-class distances.
Distance-based cross-entropy loss
Objective function that minimizes the divergence between the predicted distribution based on distances to prototypes and the true labels of query examples.
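A minimal NumPy sketch of this objective, computing the cross-entropy of the softmax over negated distances (`prototypical_loss` is an illustrative name):

```python
import numpy as np

def prototypical_loss(dists, targets):
    """Mean cross-entropy over softmax(-distances).
    dists: (n_query, n_classes); targets: (n_query,) integer labels."""
    logits = -dists
    logits = logits - logits.max(axis=1, keepdims=True)
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -log_probs[np.arange(len(targets)), targets].mean()

# A query very close to its true prototype yields a near-zero loss;
# a query far from it yields a large loss.
low = prototypical_loss(np.array([[0.0, 10.0]]), np.array([0]))
high = prototypical_loss(np.array([[10.0, 0.0]]), np.array([0]))
```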
Prototype initialization
Process of initial computation of class representations before training, often performed via random initialization or by averaging embeddings from an encoder pre-trained on more abundant data.
Zero-shot transfer
Ability of Prototypical Networks to generalize to classes unseen during training, using only descriptions or semantic attributes.
Online prototype updates
Dynamic adaptation of class representations during inference by gradually incorporating new examples to refine decision boundaries.
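One simple way to incorporate new examples incrementally is a running-mean update of the prototype; this is a sketch of the idea, not a prescribed algorithm:

```python
import numpy as np

def update_prototype(prototype, count, new_embedding):
    """Incremental (running-mean) prototype update as new support
    examples arrive; returns the updated prototype and example count."""
    count += 1
    prototype = prototype + (new_embedding - prototype) / count
    return prototype, count

# Prototype initially built from one example [0, 0]:
proto, n = np.array([0.0, 0.0]), 1
proto, n = update_prototype(proto, n, np.array([2.0, 2.0]))  # mean of 2 examples
proto, n = update_prototype(proto, n, np.array([4.0, 4.0]))  # mean of 3 examples
```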
Weighted example aggregation
Improved variant of prototype computation using weights for each support example based on their quality or relevance to the class representation.
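A minimal sketch of the weighted variant; how the weights themselves are obtained (e.g. attention scores or quality estimates) varies by method and is left abstract here:

```python
import numpy as np

def weighted_prototype(embeddings, weights):
    """Weighted mean of support embeddings; weights are normalized
    to sum to 1 (illustrative helper)."""
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()
    return (w[:, None] * embeddings).sum(axis=0)

# Example: the second support example is weighted three times more heavily
proto = weighted_prototype(np.array([[0.0, 0.0], [2.0, 2.0]]), [1.0, 3.0])
```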
Metric latent space
Learned intermediate representation where the geometric structure preserves similarity relationships between classes, facilitating linear separation in the prototype space.
Prototype regularization
Technique that prevents overfitting by constraining prototypes to remain in specific regions of the embedding space or by penalizing their excessive dispersion.
Episode-based learning
Training strategy where each batch contains several independent few-shot episodes, allowing the model to learn to quickly adapt to new tasks.
Adaptive Mahalanobis distance
Extension of Prototypical Networks using a learned distance metric that takes into account the covariance of data in each class for better separation.
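The Mahalanobis distance to a prototype can be sketched as follows; in practice the per-class covariance is estimated from the support set and regularized when K is small (the values below are illustrative):

```python
import numpy as np

def mahalanobis_distance(x, prototype, cov):
    """d_M(x, c) = sqrt((x - c)^T  Sigma^{-1}  (x - c)),
    where Sigma is the (assumed invertible) class covariance."""
    diff = x - prototype
    return float(np.sqrt(diff @ np.linalg.inv(cov) @ diff))

# With covariance diag(4, 1), a displacement of 2 along the first axis
# counts the same as a displacement of 1 along the second.
d = mahalanobis_distance(np.array([2.0, 0.0]), np.array([0.0, 0.0]),
                         np.diag([4.0, 1.0]))
```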
Dynamic prototype
Variant where prototypes are not fixed but adapt based on the context or specific characteristics of each query example to be classified.