AI Glossary
The Complete Dictionary of Artificial Intelligence
Replay Memory
A buffer that stores past training examples and replays them alongside new data during retraining, countering catastrophic forgetting. This technique maintains a balanced sample of historical knowledge while the model integrates new information.
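A minimal sketch of such a buffer, using reservoir sampling so that every example seen so far has an equal chance of remaining in memory (the class name and interface are illustrative, not from any particular library):

```python
import random

class ReplayMemory:
    """Fixed-size buffer of past training examples.

    Reservoir sampling keeps a uniform sample over everything
    seen so far, so the stored history stays balanced.
    """

    def __init__(self, capacity, seed=0):
        self.capacity = capacity
        self.buffer = []
        self.seen = 0
        self.rng = random.Random(seed)

    def add(self, example):
        self.seen += 1
        if len(self.buffer) < self.capacity:
            self.buffer.append(example)
        else:
            # Replace a stored example with probability capacity / seen.
            j = self.rng.randrange(self.seen)
            if j < self.capacity:
                self.buffer[j] = example

    def sample(self, batch_size):
        # Mix old examples into each new training batch.
        return self.rng.sample(self.buffer, min(batch_size, len(self.buffer)))
```

During retraining, each batch of new data would be combined with a `sample()` from the buffer, so gradients keep reflecting old tasks.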
Knowledge Distillation
Technique in which a student model is trained to imitate the outputs of a previously trained teacher model in order to retain acquired knowledge. The approach uses the teacher's softened probability distributions as guidance, preserving the learned relationships between classes.
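The soft-target idea can be sketched as a loss between temperature-softened distributions; this is a minimal pure-Python version (the temperature value and function names are illustrative):

```python
import math

def softmax(logits, temperature=1.0):
    # Temperature > 1 softens the distribution, exposing the
    # teacher's learned similarities between classes.
    exps = [math.exp(z / temperature) for z in logits]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(student_logits, teacher_logits, temperature=2.0):
    """Cross-entropy between the softened teacher and student outputs."""
    teacher = softmax(teacher_logits, temperature)
    student = softmax(student_logits, temperature)
    # T^2 scaling keeps gradient magnitudes comparable across temperatures.
    return -temperature ** 2 * sum(
        t * math.log(s) for t, s in zip(teacher, student))
```

In practice this term is mixed with the ordinary hard-label loss, weighting how strongly the student follows the teacher.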
Progressive Neural Networks
Architecture in which each new task is given its own neural column, with lateral connections to the columns of previous tasks. Freezing earlier columns avoids direct interference between tasks, while the lateral connections enable knowledge transfer.
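A toy NumPy sketch of the column-plus-laterals structure, with one linear layer per column (dimensions, initialization, and the absence of training are all simplifications):

```python
import numpy as np

class ProgressiveNet:
    """One linear column per task; later columns also read the
    hidden activations of earlier (frozen) columns via lateral weights."""

    def __init__(self, in_dim, hidden_dim, seed=0):
        self.rng = np.random.default_rng(seed)
        self.in_dim, self.hidden_dim = in_dim, hidden_dim
        self.columns = []   # per-task input->hidden weights
        self.laterals = []  # per-task weights from each earlier column

    def add_column(self):
        w = self.rng.standard_normal((self.hidden_dim, self.in_dim))
        lats = [self.rng.standard_normal((self.hidden_dim, self.hidden_dim))
                for _ in self.columns]
        self.columns.append(w)
        self.laterals.append(lats)

    def forward(self, x, task):
        hiddens = []
        for t in range(task + 1):
            h = self.columns[t] @ x
            # Lateral connections feed earlier tasks' features forward.
            for lat, h_prev in zip(self.laterals[t], hiddens):
                h = h + lat @ h_prev
            hiddens.append(np.tanh(h))
        return hiddens[task]
```

Only the newest column's weights would be trained; earlier columns stay fixed, which is what prevents interference.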
Task-Incremental Learning
Learning scenario in which tasks are presented sequentially with clearly defined task boundaries, and the task identity is known at test time. The model must perform well on all tasks learned so far without simultaneous access to the data of every task.
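The protocol can be illustrated with a shared model that keeps one output head per task; classification here is a hypothetical nearest-centroid rule, chosen only to keep the sketch self-contained:

```python
import numpy as np

class TaskIncrementalModel:
    """Tasks arrive one at a time; each gets its own head keyed by
    task id, and the task id is also supplied at prediction time."""

    def __init__(self):
        self.heads = {}  # task_id -> {class label: centroid}

    def learn_task(self, task_id, xs, ys):
        groups = {}
        for x, y in zip(xs, ys):
            groups.setdefault(y, []).append(x)
        # Old tasks' data is not revisited; only this task's head changes.
        self.heads[task_id] = {y: np.mean(v, axis=0) for y, v in groups.items()}

    def predict(self, task_id, x):
        head = self.heads[task_id]  # task identity given at test time
        return min(head, key=lambda y: np.linalg.norm(x - head[y]))
```

Evaluation then loops over all tasks seen so far, measuring whether earlier heads still perform after later tasks were learned.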
Class-Incremental Learning
Paradigm in which new classes are progressively added to the classifier without revisiting the data of old classes, and predictions must be made over all classes seen so far. This setting simulates real-world conditions where categories evolve over time.
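In contrast to the task-incremental case, a single classifier must grow to cover every class; a minimal nearest-centroid sketch (the centroid rule is an illustrative stand-in for a trained classifier):

```python
import numpy as np

class ClassIncrementalClassifier:
    """New classes extend one shared classifier; prediction must pick
    among ALL classes seen so far, with no task id available."""

    def __init__(self):
        self.centroids = {}  # class label -> mean feature vector

    def add_classes(self, xs, ys):
        # Only the new classes' data is seen; old centroids are kept as-is.
        for label in set(ys):
            pts = [x for x, y in zip(xs, ys) if y == label]
            self.centroids[label] = np.mean(pts, axis=0)

    def predict(self, x):
        return min(self.centroids,
                   key=lambda y: np.linalg.norm(x - self.centroids[y]))
```

The difficulty is visible in `predict`: old and new classes compete directly, so any drift in the representation of old classes degrades accuracy.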
Domain Adaptation
Process of adjusting a model trained on a source domain so that it performs effectively on a different target domain. Incremental adaptation lets the model track gradual shifts in the data distribution.
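One simple form of adaptation aligns feature statistics between domains; this sketch does naive per-dimension moment matching (a hedged, minimal stand-in for heavier alignment methods such as CORAL):

```python
import numpy as np

def align_to_source(target_features, source_features):
    """Shift and scale target-domain features so their per-dimension
    mean and standard deviation match the source domain's statistics."""
    t_mu, t_sd = target_features.mean(0), target_features.std(0) + 1e-8
    s_mu, s_sd = source_features.mean(0), source_features.std(0) + 1e-8
    # Standardize under the target statistics, re-express under the source's.
    return (target_features - t_mu) / t_sd * s_sd + s_mu
```

A source-trained classifier can then be applied to the aligned target features; richer methods match higher-order statistics or learn the alignment jointly with the model.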
Curriculum Learning
Training strategy that presents samples in order of increasing difficulty to improve convergence. An adaptive curriculum adjusts the ordering dynamically according to the model's progress.
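The fixed-order variant reduces to sorting by a difficulty score and batching; a minimal sketch, where the scoring function is any proxy the practitioner chooses (e.g. sequence length, or the loss of a pretrained model):

```python
def curriculum_batches(samples, difficulty, batch_size):
    """Yield training batches in order of increasing difficulty.

    `difficulty` maps a sample to a score; lower scores are seen first.
    """
    ordered = sorted(samples, key=difficulty)
    for i in range(0, len(ordered), batch_size):
        yield ordered[i:i + batch_size]
```

An adaptive curriculum would instead re-score or re-sample between epochs based on the model's current performance, rather than fixing the order up front.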
Synaptic Regularization
Technique that protects the neural connections most important for past tasks while new information is learned. The approach estimates the importance of each weight and penalizes changes to critical weights, preserving performance on previous tasks.
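The penalty at the heart of methods like Elastic Weight Consolidation (EWC) is a quadratic anchor on each weight, scaled by its importance; a minimal sketch (the importance values would in practice come from an estimate such as the diagonal Fisher information):

```python
def ewc_penalty(weights, old_weights, importance, strength=1.0):
    """EWC-style regularizer added to the new task's loss.

    Each weight is pulled toward its value after the previous task,
    with the pull scaled by that weight's estimated importance.
    """
    return strength / 2 * sum(
        f * (w - w_old) ** 2
        for w, w_old, f in zip(weights, old_weights, importance))
```

Weights with high importance are effectively frozen, while unimportant ones stay free to learn the new task.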