AI Glossary
A comprehensive dictionary of Artificial Intelligence
Task-agnostic Training
Training approach in which models learn general representations without optimizing for specific tasks. This promotes flexibility and transfer to new applications.
Cross-domain Transfer
Ability of a model to apply knowledge acquired in one domain to tasks in a completely different domain. This transferability is crucial for the success of zero-shot learning.
Foundation Models
Large-scale models pre-trained on massive and diverse data, serving as a foundation for multiple downstream applications. These models form the backbone of modern zero-shot learning.
Self-consistency
Inference method that samples multiple reasoning paths for the same problem and selects the most frequent answer. This approach improves the reliability of zero-shot responses by exploiting redundancy across samples.
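The majority-vote step can be sketched as follows. This is a minimal illustration, not a specific library's API: `sample_answer` is a hypothetical callable standing in for one sampled reasoning path from a model.

```python
import random
from collections import Counter

def self_consistency(sample_answer, n_samples=5):
    """Draw several answers and return the most frequent one (majority vote)."""
    answers = [sample_answer() for _ in range(n_samples)]
    return Counter(answers).most_common(1)[0][0]

# Toy stochastic "reasoner" standing in for a model's sampled reasoning paths.
random.seed(0)
def noisy_answer():
    return random.choice(["42", "42", "42", "17"])

consensus = self_consistency(noisy_answer, n_samples=7)
```

With enough samples, occasional wrong answers (here "17") are outvoted by the consistent majority.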
Model Capacity
Measure of the complexity and number of parameters a model can effectively use to store knowledge. Sufficient capacity is required for the emergence of zero-shot capabilities.
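As a rough proxy for capacity, one can count the trainable parameters of a model. The sketch below assumes a plain fully connected network (weights plus biases per layer); it is an illustration, not a general capacity measure.

```python
def count_parameters(layer_sizes):
    """Parameter count of a dense network: weights (in*out) plus biases (out) per layer."""
    return sum(i * o + o for i, o in zip(layer_sizes, layer_sizes[1:]))

# A small MNIST-style classifier: 784 -> 256 -> 10
n_params = count_parameters([784, 256, 10])  # 784*256 + 256 + 256*10 + 10
```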
Task Adaptation
Process by which a pre-trained model dynamically adjusts to a specific new task at inference time. This adaptation without retraining is at the heart of zero-shot learning.
Generalization Gap
Performance difference between tasks seen during training and completely new tasks. Reducing this gap is the fundamental objective of zero-shot learning.
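The gap can be measured directly as the difference between average performance on tasks seen during training and on held-out tasks. A minimal sketch, assuming scores are comparable accuracies in [0, 1]:

```python
def generalization_gap(seen_task_scores, unseen_task_scores):
    """Mean performance on seen tasks minus mean performance on unseen tasks."""
    mean = lambda xs: sum(xs) / len(xs)
    return mean(seen_task_scores) - mean(unseen_task_scores)

# Hypothetical accuracies: three training tasks vs. two novel tasks.
gap = generalization_gap([0.92, 0.88, 0.90], [0.70, 0.74])
```

A gap near zero indicates the model generalizes to new tasks almost as well as to the tasks it was trained on.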
Zero-shot Prompting
Technique of providing a model with only a task description, without any examples, to guide its response. This method directly tests the model's generalization capabilities.
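A zero-shot prompt contains instructions and the input but no demonstrations. The template below is a minimal sketch with a made-up format; real prompts vary by model and task.

```python
def zero_shot_prompt(task_description, input_text):
    """Build a zero-shot prompt: task instructions only, no worked examples."""
    return f"{task_description}\n\nInput: {input_text}\nAnswer:"

prompt = zero_shot_prompt(
    "Classify the sentiment of the input as positive or negative.",
    "The film was a delight from start to finish.",
)
```

Contrast this with few-shot prompting, where the same template would be preceded by several solved input/answer pairs.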