AI Glossary
The Complete Dictionary of Artificial Intelligence
Incremental Learning
Methods enabling models to learn from new samples without requiring complete retraining from scratch.
Knowledge Distillation
Technique for transferring knowledge from an old model to a new model to preserve performance on previous tasks.
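A minimal sketch of the core idea: the teacher's logits are softened with a temperature and the student is trained against these soft targets via cross-entropy. The function names and the temperature value are illustrative, not a fixed API.

```python
import math

def softmax(logits, temperature=1.0):
    # Temperature > 1 softens the distribution, exposing the teacher's
    # relative preferences among wrong classes ("dark knowledge").
    exps = [math.exp(z / temperature) for z in logits]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(teacher_logits, student_logits, temperature=2.0):
    # Cross-entropy between the teacher's soft targets and the
    # student's softened predictions; minimized when they match.
    p = softmax(teacher_logits, temperature)
    q = softmax(student_logits, temperature)
    return -sum(pi * math.log(qi) for pi, qi in zip(p, q))
```

In continual learning this loss is typically added to the new-task loss, so the student keeps matching the old model's outputs while learning new data.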
Elastic Weight Consolidation
Regularization approach identifying important weights for previous tasks and limiting their modification during new learning.
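The EWC penalty itself is a short formula: a quadratic term anchoring each weight to its old value, scaled by that weight's (diagonal) Fisher importance. A minimal sketch, with flat parameter lists standing in for real network tensors:

```python
def ewc_penalty(params, old_params, fisher, lam=1.0):
    # 0.5 * lambda * sum_i F_i * (theta_i - theta*_i)^2
    # High-Fisher (important) weights are strongly pulled back toward
    # their post-old-task values; unimportant weights stay free to move.
    return 0.5 * lam * sum(
        f * (p - p0) ** 2
        for p, p0, f in zip(params, old_params, fisher)
    )
```

During training on a new task, this penalty is added to the new task's loss, implementing the "limit modification of important weights" behavior described above.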
Episodic Memories
Selective stores of past examples that are replayed during later training to refresh old knowledge and prevent catastrophic forgetting.
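A common way to keep such a store bounded is reservoir sampling, which maintains a fixed-size, uniformly representative buffer over an unbounded stream. A minimal sketch (class name is illustrative):

```python
import random

class EpisodicMemory:
    def __init__(self, capacity, seed=0):
        self.capacity = capacity
        self.buffer = []
        self.seen = 0
        self.rng = random.Random(seed)

    def add(self, example):
        # Reservoir sampling: after t examples, each one has been kept
        # with equal probability capacity / t.
        self.seen += 1
        if len(self.buffer) < self.capacity:
            self.buffer.append(example)
        else:
            j = self.rng.randrange(self.seen)
            if j < self.capacity:
                self.buffer[j] = example

    def sample(self, k):
        # Draw a replay mini-batch to mix with new-task data.
        return self.rng.sample(self.buffer, min(k, len(self.buffer)))
```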
Replay Generation
Synthetic creation of examples representative of past tasks to maintain performance without storing real data.
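In its simplest form, replay generation fits a compact generative model to old-task data and later samples synthetic stand-ins from it instead of storing the raw examples. A toy sketch using a single Gaussian per feature (real systems use learned generators such as VAEs or GANs; these function names are illustrative):

```python
import random
import statistics

def fit_gaussian(data):
    # Summarize old-task data by its mean and standard deviation,
    # so the raw samples can be discarded.
    return statistics.mean(data), statistics.stdev(data)

def generate_replay(mean, std, n, seed=0):
    # Draw synthetic examples approximating the old distribution.
    rng = random.Random(seed)
    return [rng.gauss(mean, std) for _ in range(n)]
```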
Dynamic Architectures
Models capable of extending or modifying their network structure to integrate new knowledge without disrupting previously learned knowledge.
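A minimal structural sketch of the idea: a model that grows a new, task-specific output head for each task, so new parameters are added rather than old ones overwritten. The class and method names are illustrative placeholders, not a real library API.

```python
class MultiHeadModel:
    """Grows a new head per task; existing heads are never modified."""

    def __init__(self):
        self.heads = {}

    def add_task(self, task_id, head_fn):
        # Extending the architecture: new parameters (head_fn) are
        # attached without touching previously added heads.
        self.heads[task_id] = head_fn

    def predict(self, task_id, features):
        return self.heads[task_id](features)
```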
Concept Drift Detection
Algorithms identifying changes in data distribution to continuously adapt learning models.
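A simple instance of the idea: monitor a sliding window of a stream statistic (e.g. model error) and flag drift when its mean departs from a reference level. The window size and threshold are illustrative defaults; production detectors (ADWIN, Page-Hinkley, DDM) are more principled.

```python
from collections import deque

class DriftDetector:
    """Flags drift when the recent window's mean shifts from the reference."""

    def __init__(self, window=30, threshold=0.5):
        self.window = deque(maxlen=window)
        self.reference = None
        self.threshold = threshold

    def update(self, value):
        self.window.append(value)
        if len(self.window) < self.window.maxlen:
            return False  # not enough data yet
        mean = sum(self.window) / len(self.window)
        if self.reference is None:
            self.reference = mean  # establish the baseline regime
            return False
        if abs(mean - self.reference) > self.threshold:
            self.reference = mean  # adapt to the new regime
            return True
        return False
```

A drift flag would typically trigger model adaptation, e.g. retraining on recent data.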
Continual Reinforcement Learning
Application of lifelong learning principles to reinforcement learning systems facing evolving environments.
Continual Meta-Learning
Approaches learning to efficiently learn new tasks while retaining previously acquired capabilities.
Lifelong Learning Evaluation
Specific protocols and metrics to measure the ability of systems to learn continuously without forgetting.
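One widely used metric of this kind is backward transfer (BWT): how much final performance on earlier tasks changed relative to the accuracy measured right after each task was learned. Negative BWT quantifies forgetting. A minimal sketch over an accuracy matrix:

```python
def backward_transfer(acc):
    """acc[i][j] = accuracy on task j measured after training on task i.

    BWT = mean over earlier tasks j of (final accuracy on j) minus
    (accuracy on j right after learning j). Negative values = forgetting.
    """
    T = len(acc)
    return sum(acc[T - 1][j] - acc[j][j] for j in range(T - 1)) / (T - 1)
```

Average final accuracy across all tasks is usually reported alongside BWT to capture overall performance, not just retention.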
Stability-Plasticity Dilemma
Study of the optimal balance between the capacity to learn new information (plasticity) and the preservation of knowledge (stability).
Progressive Multi-Task Learning
Sequential management of multiple tasks where each new task is learned without degrading performance on previous tasks.
Neural Network Compaction
Efficient compression techniques for accumulated knowledge, keeping models at a manageable size as learning continues.
Continual Transfer Learning
Adaptive methods for transferring knowledge between evolving domains while preserving prior skills.
Catastrophic Forgetting Regularization
Set of techniques designed to counter catastrophic forgetting during sequential learning of multiple tasks.