
AI Glossary

The complete dictionary of Artificial Intelligence

162 categories · 2,032 subcategories · 23,060 terms

Progressive Networks

Neural network architecture that adds a new column of layers for each new task while freezing the columns trained on previous tasks. This approach enables continual learning without catastrophic forgetting.
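As an illustration only, here is a minimal NumPy sketch of the column-wise idea; the class names, layer sizes, and initialization are invented for this example, not taken from the original paper. Each new task gets a fresh column, earlier columns are frozen, and lateral connections feed the hidden activations of earlier columns into the new one.

```python
import numpy as np

rng = np.random.default_rng(0)

class Column:
    """One task-specific column: a single hidden layer plus an output head."""
    def __init__(self, in_dim, hidden, out_dim, n_lateral):
        self.W = rng.normal(size=(in_dim, hidden)) * 0.1      # own input weights
        self.U = [rng.normal(size=(hidden, hidden)) * 0.1     # one lateral weight
                  for _ in range(n_lateral)]                  # matrix per earlier column
        self.V = rng.normal(size=(hidden, out_dim)) * 0.1
        self.frozen = False

    def hidden_out(self, x, prev_hiddens):
        h = x @ self.W
        for U, hp in zip(self.U, prev_hiddens):               # lateral connections:
            h = h + hp @ U                                    # reuse earlier features
        return np.maximum(h, 0.0)                             # ReLU

class ProgressiveNet:
    def __init__(self, in_dim, hidden, out_dim):
        self.in_dim, self.hidden, self.out_dim = in_dim, hidden, out_dim
        self.columns = []

    def add_column(self):
        for c in self.columns:
            c.frozen = True                                   # preserve old knowledge
        self.columns.append(Column(self.in_dim, self.hidden,
                                   self.out_dim, len(self.columns)))

    def forward(self, x, task):
        hiddens = []
        for c in self.columns[:task + 1]:
            hiddens.append(c.hidden_out(x, hiddens))
        return hiddens[task] @ self.columns[task].V

net = ProgressiveNet(in_dim=4, hidden=8, out_dim=2)
net.add_column()                  # column for task 0
net.add_column()                  # column for task 1; column 0 is now frozen

x = rng.normal(size=(3, 4))
y0 = net.forward(x, task=0)       # uses column 0 only
y1 = net.forward(x, task=1)       # uses column 1 plus lateral input from column 0
```

Note that predictions for task 0 never touch column 1, which is how the architecture preserves earlier knowledge by construction.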


Lateral Connections

Connections that link columns from previous tasks to new columns in the Progressive Networks architecture. They enable knowledge transfer and reuse of learned features.


Column-wise Architecture

Organizational structure where each task has its own distinct but interconnected neural column. This architecture facilitates model expansion for new tasks without disrupting existing knowledge.


Forward Transfer

Phenomenon where knowledge acquired from previous tasks improves performance on new tasks. The lateral connections in Progressive Networks are designed to exploit this positive transfer.


Backward Transfer

Ability of a model to retroactively improve performance on previous tasks after learning new tasks. Because Progressive Networks freeze earlier columns, they rule out negative backward transfer (forgetting) but also preclude positive backward transfer.


Knowledge Preservation

Mechanism that ensures knowledge acquired from previous tasks is not degraded when learning new tasks. This preservation is fundamental in continual learning.


Task-specific Adapters

Specialized modules inserted into the architecture to adapt general features to the specific requirements of each task. These adapters provide per-task flexibility while preserving shared knowledge.
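A common concrete form is the bottleneck adapter: down-project, apply a nonlinearity, up-project, and add a residual connection. The sketch below is a hypothetical NumPy illustration (class name, dimensions, and the zero-initialized up-projection are choices made for this example): with `W_up` initialized to zero the adapter starts as the identity, so inserting it does not disturb the frozen shared features.

```python
import numpy as np

rng = np.random.default_rng(1)

def relu(z):
    return np.maximum(z, 0.0)

class BottleneckAdapter:
    """Task-specific adapter: down-project, nonlinearity, up-project,
    with a residual connection around the module."""
    def __init__(self, dim, bottleneck):
        self.W_down = rng.normal(size=(dim, bottleneck)) * 0.1
        self.W_up = np.zeros((bottleneck, dim))   # zero init: starts as identity

    def __call__(self, h):
        return h + relu(h @ self.W_down) @ self.W_up

# One frozen shared layer, one adapter per task.
W_shared = rng.normal(size=(4, 4)) * 0.1
adapters = {t: BottleneckAdapter(dim=4, bottleneck=2)
            for t in ("task_a", "task_b")}

x = rng.normal(size=(3, 4))
h = relu(x @ W_shared)            # shared features, never updated
out_a = adapters["task_a"](h)     # task-specific adaptation of the shared features
```

During continual learning only the adapter weights for the current task would be trained; the shared backbone stays fixed.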


Capacity Expansion

Strategy for dynamically increasing model capacity by adding new neural columns to accommodate new tasks. This controlled expansion avoids resource saturation.


Neural Plasticity

The ability of neural networks to adapt and modify their connections in response to new information. In Progressive Networks, this plasticity is controlled to preserve existing knowledge.


Synaptic Intelligence

Continual learning method that estimates the importance of each parameter online, along the training trajectory, and penalizes changes to parameters important for previous tasks. This per-synapse importance estimate serves as a knowledge preservation mechanism.


Elastic Weight Consolidation

Regularization technique that adds a quadratic penalty on changes to weights deemed important for previous tasks, with importance estimated from the Fisher information. This elastic approach trades off learning the new task against preserving the old ones.
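The penalty has a simple closed form, sketched below with made-up numbers (the function name and the toy Fisher values are for illustration only): drift on a weight the Fisher estimate marks as unimportant costs nothing, while drift on an important weight dominates the penalty.

```python
import numpy as np

def ewc_penalty(theta, theta_star, fisher, lam=1.0):
    """Quadratic EWC penalty: (lam / 2) * sum_i F_i * (theta_i - theta_star_i)^2.

    theta      -- current parameters
    theta_star -- parameters after training the previous task
    fisher     -- diagonal Fisher information estimate (per-parameter importance)
    lam        -- strength of the preservation term
    """
    return 0.5 * lam * np.sum(fisher * (theta - theta_star) ** 2)

theta_star = np.array([1.0, -2.0, 0.5])
fisher     = np.array([10.0, 0.0, 1.0])   # weight 0 matters most, weight 1 not at all

drift_unimportant = np.array([1.0, 5.0, 0.5])   # large drift on the unimportant weight
drift_important   = np.array([1.3, -2.0, 0.5])  # small drift on the important weight

p_free   = ewc_penalty(drift_unimportant, theta_star, fisher)  # no cost
p_costly = ewc_penalty(drift_important, theta_star, fisher)    # penalized
```

In training, `p_costly`-style terms are simply added to the new task's loss before backpropagation.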


Memory Aware Synapses

Approach that evaluates the importance of each synapse as the sensitivity of the model's output norm to that parameter, computed from unlabeled data. These importance weights then penalize changes to parameters that mattered for past tasks.
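For a linear model the importance estimate can be written analytically, which the hedged sketch below does (the function name and dimensions are invented for this example): for f(x) = W·x, the gradient of ‖f(x)‖² with respect to W_jk is 2·f_j(x)·x_k, and MAS averages its absolute value over inputs.

```python
import numpy as np

def mas_importance(W, X):
    """MAS importance for a linear model f(x) = W @ x.

    Omega_jk = mean over inputs x of | d ||f(x)||^2 / d W_jk |
             = mean over inputs x of | 2 * f_j(x) * x_k |.

    Needs only unlabeled inputs X -- no task labels required.
    """
    omega = np.zeros_like(W)
    for x in X:
        f = W @ x
        omega += np.abs(2.0 * np.outer(f, x))   # per-parameter sensitivity
    return omega / len(X)

rng = np.random.default_rng(2)
W = rng.normal(size=(2, 3))
X = rng.normal(size=(16, 3))
omega = mas_importance(W, X)

# When learning the next task, changes to W would be penalized in proportion
# to omega, e.g. loss += lam * np.sum(omega * (W - W_old) ** 2).
```

The key design point the entry highlights is the label-free importance signal: unlike Fisher-based estimates, this can be computed on any unlabeled stream of inputs.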


Gradient Episodic Memory

Mechanism for storing examples from previous tasks and constraining gradient updates on new tasks so that loss on those stored examples does not increase. This episodic memory guides the gradients used when learning new tasks, countering catastrophic forgetting.
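In the single-constraint simplification (the A-GEM variant of this idea), the constraint reduces to one projection; the sketch below uses toy two-dimensional gradients chosen for this example. If the proposed update g conflicts with the gradient on the stored memory examples (g·g_mem < 0), g is projected so the conflict disappears.

```python
import numpy as np

def project_gradient(g, g_mem):
    """Single-constraint gradient projection (A-GEM-style simplification of GEM).

    If g . g_mem < 0, the update g would increase loss on the episodic
    memory, so project it:
        g' = g - (g . g_mem / g_mem . g_mem) * g_mem
    which guarantees g' . g_mem >= 0.
    """
    dot = g @ g_mem
    if dot >= 0.0:
        return g                                  # no conflict with the memory
    return g - (dot / (g_mem @ g_mem)) * g_mem

g     = np.array([1.0, -1.0])                     # gradient on the new task
g_mem = np.array([0.0,  1.0])                     # gradient on stored past examples
g_proj = project_gradient(g, g_mem)               # conflict removed by projection
```

The full GEM formulation enforces one such constraint per previous task via a quadratic program; the single-memory projection above is the cheapest special case.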


Dynamic Architecture Expansion

Process of automatically adding new neural resources when the model reaches its maximum capacity. This dynamic expansion is essential for continual learning systems.


Multi-task Representation

Shared feature space capturing common information across multiple tasks simultaneously. These multi-task representations are optimized in the Progressive Networks architecture.
