
AI Glossary

The complete dictionary of Artificial Intelligence

162 Categories · 2,032 Subcategories · 23,060 Terms

Progressive Networks

Neural network architecture that adds new columns for each new task while preserving knowledge from previous tasks. This approach enables continual learning without catastrophic forgetting.
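A minimal numpy sketch of this column-per-task idea (illustrative only, not the original DeepMind implementation; sizes and the `Column` class are assumptions). Earlier columns stay frozen, and each new column receives the hidden activations of all previous columns through lateral connections:

```python
import numpy as np

rng = np.random.default_rng(0)
D_IN, D_HID = 4, 3

class Column:
    """One per task; earlier columns are never modified."""
    def __init__(self, n_lateral):
        self.W = rng.normal(size=(D_HID, D_IN))          # own input weights
        # one lateral weight matrix per previous column's hidden layer
        self.U = [rng.normal(size=(D_HID, D_HID)) for _ in range(n_lateral)]

    def forward(self, x, prev_hiddens):
        h = self.W @ x
        for U_k, h_k in zip(self.U, prev_hiddens):       # lateral connections
            h += U_k @ h_k
        return np.tanh(h)

columns = []
def add_column():
    columns.append(Column(n_lateral=len(columns)))       # grow, don't overwrite

def forward_task(x, task_id):
    hiddens = []
    for col in columns[: task_id + 1]:
        hiddens.append(col.forward(x, hiddens))
    return hiddens[task_id]

add_column()                      # task 1
add_column()                      # task 2: new column, task-1 weights untouched
x = rng.normal(size=D_IN)
out = forward_task(x, task_id=1)
print(out.shape)                  # (3,)
```

Because old columns are frozen, learning task 2 cannot degrade task 1, which is how the architecture avoids catastrophic forgetting.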

Lateral Connections

Connections that link columns from previous tasks to new columns in the Progressive Networks architecture. They enable knowledge transfer and reuse of learned features.

Column-wise Architecture

Organizational structure where each task has its own distinct but interconnected neural column. This architecture facilitates model expansion for new tasks without disrupting existing knowledge.

Forward Transfer

Phenomenon where knowledge acquired from previous tasks improves performance on new tasks. This positive transfer is optimized by lateral connections in Progressive Networks.

Backward Transfer

Ability of a model to retroactively improve performance on previous tasks after learning new tasks. Progressive Networks facilitate this bidirectional knowledge transfer.
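Backward transfer is commonly quantified from an accuracy matrix R, where R[t, i] is accuracy on task i after training up to task t: BWT averages R[T-1, i] − R[i, i] over earlier tasks. A short sketch with hypothetical accuracies:

```python
import numpy as np

# R[t, i]: accuracy on task i after training through task t (toy numbers).
R = np.array([
    [0.90, 0.00, 0.00],
    [0.88, 0.85, 0.00],
    [0.91, 0.87, 0.80],   # earlier-task accuracies rose after task 3
])
T = R.shape[0]
# Average change on earlier tasks between when they were learned and the end.
bwt = np.mean([R[T - 1, i] - R[i, i] for i in range(T - 1)])
print(round(bwt, 3))   # 0.015
```

A positive BWT means later learning retroactively improved earlier tasks; a negative value indicates forgetting.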

Knowledge Preservation

Mechanism that ensures knowledge acquired from previous tasks is not degraded when learning new tasks. This preservation is fundamental in continual learning.

Task-specific Adapters

Specialized modules inserted in the architecture to adapt general features to the specific requirements of each task. These adapters allow optimal flexibility while preserving shared knowledge.
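One common adapter shape is a small residual bottleneck applied after a frozen shared layer, roughly as sketched below (the sizes, names, and random initialization are illustrative assumptions, not a specific library's API):

```python
import numpy as np

rng = np.random.default_rng(1)
D, BOTTLENECK = 8, 2

W_shared = rng.normal(size=(D, D))        # frozen, shared across all tasks

class Adapter:
    """Tiny per-task bottleneck with a residual connection."""
    def __init__(self):
        self.down = rng.normal(size=(BOTTLENECK, D)) * 0.1
        self.up = rng.normal(size=(D, BOTTLENECK)) * 0.1

    def __call__(self, h):
        return h + self.up @ np.tanh(self.down @ h)

adapters = {"task_a": Adapter(), "task_b": Adapter()}

def forward(x, task):
    h = np.tanh(W_shared @ x)             # shared features (preserved)
    return adapters[task](h)              # task-specific adaptation

x = rng.normal(size=D)
ya = forward(x, "task_a")
yb = forward(x, "task_b")
print(ya.shape)                           # (8,)
```

Only the adapter parameters are trained per task, so the shared weights, and therefore the shared knowledge, stay intact.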

Capacity Expansion

Strategy for dynamically increasing model capacity by adding new neural columns to accommodate new tasks. This controlled expansion avoids resource saturation.

Neural Plasticity

The ability of neural networks to adapt and modify their connections in response to new information. In Progressive Networks, this plasticity is controlled to preserve existing knowledge.

Synaptic Intelligence

Continuous learning method that identifies and protects synaptic connections important for previous tasks. This synaptic intelligence is integrated into knowledge preservation mechanisms.

Elastic Weight Consolidation

Regularization technique that penalizes significant changes in synaptic weights crucial for previous tasks. This elastic approach allows a compromise between learning and preservation.
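The EWC penalty has the form (λ/2) Σᵢ Fᵢ (θᵢ − θᵢ*)², where θ* are the weights after the previous task and F is a diagonal Fisher-information estimate of each weight's importance. A toy sketch (the specific numbers are made up for illustration):

```python
import numpy as np

theta_star = np.array([1.0, -0.5, 2.0])   # weights after the previous task
fisher = np.array([5.0, 0.1, 3.0])        # per-weight importance estimates
lam = 0.4                                 # regularization strength

def ewc_penalty(theta):
    return 0.5 * lam * np.sum(fisher * (theta - theta_star) ** 2)

# Moving an important weight (index 0, high Fisher value) by 0.5 costs far
# more than moving an unimportant one (index 1) by the same amount.
p_important = ewc_penalty(np.array([1.5, -0.5, 2.0]))
p_unimportant = ewc_penalty(np.array([1.0, 0.0, 2.0]))
print(p_important, p_unimportant)   # 0.25 vs 0.005
```

Added to the new task's loss, this term lets unimportant weights adapt freely while anchoring the weights that previous tasks depend on.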

Memory Aware Synapses

Approach that evaluates the importance of each synapse based on its contribution to past tasks to guide future learning. This memory awareness optimizes knowledge preservation while leaving less important parameters free to adapt.

Gradient Episodic Memory

Mechanism for storing and retrieving examples from previous tasks to counter catastrophic forgetting. This episodic memory guides gradients when learning new tasks.
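The gradient-guiding step can be sketched for the single-constraint case: if the current-task gradient conflicts with the gradient computed on stored examples (negative inner product), it is projected so past-task loss does not increase. The full method solves a quadratic program over one constraint per past task; this is a simplified illustration:

```python
import numpy as np

def gem_project(g, g_mem):
    """Project gradient g so it no longer conflicts with memory gradient g_mem."""
    dot = g @ g_mem
    if dot < 0:                            # update would raise past-task loss
        g = g - (dot / (g_mem @ g_mem)) * g_mem
    return g

g_new = np.array([1.0, -2.0])              # current-task gradient
g_past = np.array([1.0, 1.0])              # gradient from episodic memory
g_proj = gem_project(g_new, g_past)
print(g_proj)                              # [ 1.5 -1.5]
```

After projection the update is orthogonal to (or aligned with) the memory gradient, so stepping along it cannot increase the loss on the stored examples, to first order.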

Dynamic Architecture Expansion

Process of automatically adding new neural resources when the model reaches its maximum capacity. This dynamic expansion is essential for continuous learning systems.

Multi-task Representation

Shared feature space capturing common information across multiple tasks simultaneously. These multi-task representations are optimized in the Progressive Networks architecture.
