
AI Glossary

The complete glossary of AI

162 categories · 2,032 subcategories · 23,060 terms

Metric embedding space

Low-dimensional vector space where distances between points reflect semantic similarities between classes, optimized for few-shot classification tasks.


Class prototype

Central vector representation of a class, calculated as the average of embeddings of support examples of that class in the metric space.
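
As an illustrative sketch (not taken from any particular library), the prototype computation can be written in a few lines of numpy; the function name `class_prototypes` and the toy embeddings are assumptions for the example:

```python
import numpy as np

def class_prototypes(embeddings, labels):
    """Mean of the support embeddings per class -> one prototype vector each."""
    classes = np.unique(labels)
    return classes, np.stack(
        [embeddings[labels == c].mean(axis=0) for c in classes]
    )

# Toy support set: four 2-D embeddings, two classes
emb = np.array([[0.0, 0.0], [0.0, 2.0], [4.0, 0.0], [4.0, 2.0]])
lab = np.array([0, 0, 1, 1])
classes, protos = class_prototypes(emb, lab)
# Each prototype is the per-class mean: [0, 1] and [4, 1]
```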


Support set

Subset of training data used to calculate class prototypes during a meta-learning episode, containing a limited number of examples per class.


Query set

Set of examples to be classified, used to evaluate the model and drive parameter updates during training; query examples do not take part in the computation of prototypes.


N-way K-shot

Few-shot learning paradigm where N represents the number of classes to discriminate and K the number of available examples per class in the support set.
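
Episode construction under this paradigm can be sketched as follows; this is a minimal numpy example, and the function name `sample_episode` and the toy dataset are assumptions for illustration:

```python
import numpy as np

def sample_episode(X, y, n_way=2, k_shot=2, q_query=2, rng=None):
    """Sample an N-way K-shot episode: support (N*K) and query (N*Q) splits."""
    if rng is None:
        rng = np.random.default_rng(0)
    classes = rng.choice(np.unique(y), size=n_way, replace=False)
    s_idx, q_idx = [], []
    for c in classes:
        idx = rng.permutation(np.flatnonzero(y == c))
        s_idx.extend(idx[:k_shot])                    # K support examples
        q_idx.extend(idx[k_shot:k_shot + q_query])    # Q query examples
    return X[s_idx], y[s_idx], X[q_idx], y[q_idx]

# Toy dataset: 3 classes, 5 examples each, 4-D features
X = np.random.default_rng(1).normal(size=(15, 4))
y = np.repeat([0, 1, 2], 5)
Xs, ys, Xq, yq = sample_episode(X, y, n_way=2, k_shot=2, q_query=2)
```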


Euclidean distance in embedding

Distance measure used to classify query examples by computing how far they lie from each class prototype in the learned embedding space.
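
A minimal nearest-prototype classifier based on this distance might look like the following numpy sketch (the function name and toy prototypes are assumptions for the example):

```python
import numpy as np

def euclidean_dists(queries, protos):
    """Pairwise squared Euclidean distances, shape (n_query, n_class)."""
    diff = queries[:, None, :] - protos[None, :, :]
    return (diff ** 2).sum(axis=-1)

protos = np.array([[0.0, 0.0], [3.0, 0.0]])   # two class prototypes
queries = np.array([[0.5, 0.0], [2.9, 0.1]])
d = euclidean_dists(queries, protos)
pred = d.argmin(axis=1)   # each query goes to its nearest prototype
```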


Softmax on distances

Activation function that turns negative distances to the prototypes into a probability distribution over classes, used to classify query examples.
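
The transformation can be sketched in numpy as below; the stabilization by subtracting the row maximum is a standard trick, and the function name is an assumption for the example:

```python
import numpy as np

def softmax_on_distances(dists):
    """Turn negative distances to prototypes into class probabilities."""
    logits = -dists                               # smaller distance -> larger logit
    logits -= logits.max(axis=1, keepdims=True)   # numerical stability
    e = np.exp(logits)
    return e / e.sum(axis=1, keepdims=True)

dists = np.array([[0.25, 6.25]])   # one query, much closer to class 0
probs = softmax_on_distances(dists)
```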


Meta-learning episode

Training unit simulating a complete few-shot task, including the construction of prototypes from the support set and classification of the query set.


Encoding function

Parameterized neural network that transforms raw inputs into embedding vectors, optimized to minimize intra-class distances and maximize inter-class distances.
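
As a toy stand-in for such an encoder (real implementations are deep networks; the single linear layer, ReLU, and the name `make_encoder` here are assumptions purely for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)

def make_encoder(d_in, d_emb):
    """Minimal parameterized encoder: one linear layer + ReLU."""
    W = rng.normal(scale=0.1, size=(d_in, d_emb))
    b = np.zeros(d_emb)
    def encode(x):
        # Maps raw inputs (n, d_in) to embedding vectors (n, d_emb)
        return np.maximum(x @ W + b, 0.0)
    return encode

encode = make_encoder(8, 3)
z = encode(rng.normal(size=(5, 8)))   # five inputs -> five 3-D embeddings
```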


Distance-based cross-entropy loss

Objective function that minimizes the divergence between the predicted distribution based on distances to prototypes and the true labels of query examples.
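
Putting the softmax over negative distances together with cross-entropy, the loss can be sketched as follows (a minimal numpy version; the function name and toy distance matrix are assumptions for the example):

```python
import numpy as np

def proto_cross_entropy(dists, labels):
    """Mean of -log p(true class | query), with p = softmax(-distances)."""
    logits = -dists
    logits -= logits.max(axis=1, keepdims=True)   # numerical stability
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -log_probs[np.arange(len(labels)), labels].mean()

# (query, class) distances: each query is near its true prototype
dists = np.array([[0.1, 5.0], [4.0, 0.2]])
loss = proto_cross_entropy(dists, np.array([0, 1]))
```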


Prototype initialization

Process of computing initial class representations before training, often by averaging randomly initialized embeddings or by pre-training on more abundant data.


Zero-shot transfer

Ability of Prototypical Networks to generalize to classes unseen during training by using only descriptions or semantic attributes.


Online prototype updates

Dynamic adaptation of class representations during inference by gradually incorporating new examples to refine decision boundaries.
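
One simple way to realize such updates is an incremental running mean; the class name and update rule below are an assumed minimal sketch, not a specific published variant:

```python
import numpy as np

class OnlinePrototype:
    """Refine a class prototype incrementally as new examples arrive."""
    def __init__(self, init):
        self.proto = np.asarray(init, dtype=float)
        self.count = 1
    def update(self, embedding):
        # Running-mean update: proto absorbs each new embedding gradually
        self.count += 1
        self.proto += (np.asarray(embedding) - self.proto) / self.count
        return self.proto

p = OnlinePrototype([0.0, 0.0])
p.update([2.0, 0.0])   # mean of (0,0) and (2,0)
p.update([1.0, 3.0])   # mean of all three points
```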


Weighted example aggregation

Improved variant of prototype computation using weights for each support example based on their quality or relevance to the class representation.
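
The weighted mean itself is a one-liner; in this sketch the weights are supplied by hand, whereas in practice they would come from a learned quality or relevance score (the function name and toy weights are assumptions):

```python
import numpy as np

def weighted_prototype(embeddings, weights):
    """Weighted mean of support embeddings; weights reflect example quality."""
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()                      # normalize to a convex combination
    return w @ np.asarray(embeddings)

emb = np.array([[0.0, 0.0], [4.0, 0.0]])
proto = weighted_prototype(emb, [3.0, 1.0])   # trust the first example more
```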


Metric latent space

Learned intermediate representation where the geometric structure preserves similarity relationships between classes, facilitating linear separation in the prototype space.


Prototype regularization

Technique that prevents overfitting by constraining prototypes to remain in specific regions of the embedding space or by penalizing their excessive dispersion.


Episode-based learning

Training strategy where each batch contains several independent few-shot episodes, allowing the model to learn to quickly adapt to new tasks.


Adaptive Mahalanobis Distance

Extension of Prototypical Networks using a learned distance metric that takes into account the covariance of data in each class for better separation.
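
A per-class Mahalanobis distance can be sketched as below; here the covariances are given directly, whereas the adaptive variants the entry refers to learn or estimate them (the function name and toy values are assumptions):

```python
import numpy as np

def mahalanobis_dists(queries, protos, covs, eps=1e-6):
    """Squared Mahalanobis distance to each prototype, using per-class covariance."""
    d = []
    for p, S in zip(protos, covs):
        inv = np.linalg.inv(S + eps * np.eye(S.shape[0]))   # regularized inverse
        diff = queries - p
        # diff_i . S^{-1} . diff_i for every query i
        d.append(np.einsum('ij,jk,ik->i', diff, inv, diff))
    return np.stack(d, axis=1)

protos = np.array([[0.0, 0.0]])
covs = np.array([[[4.0, 0.0], [0.0, 1.0]]])   # class spread wider along x
d = mahalanobis_dists(np.array([[2.0, 0.0], [0.0, 2.0]]), protos, covs)
# Equal Euclidean distances, but the x-direction query is "closer" in class units
```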


Dynamic Prototype

Variant where prototypes are not fixed but adapt based on the context or specific characteristics of each query example to be classified.
