
AI Glossary

The complete Artificial Intelligence dictionary

162 categories · 2,032 subcategories · 23,060 terms

Progressive Networks

Neural network architecture that adds new columns for each new task while preserving knowledge from previous tasks. This approach enables continual learning without catastrophic forgetting.
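The column-per-task idea can be sketched in a few lines. This is an illustrative toy class (not from any library): each task contributes one parameter column, and all earlier columns are frozen when a new one is added.

```python
class ProgressiveNet:
    """Minimal sketch of the Progressive Networks idea: one parameter
    column per task; earlier columns are frozen when a new task
    arrives, so their knowledge cannot be overwritten."""

    def __init__(self):
        self.columns = []  # one entry per task

    def add_column(self, params):
        # Freeze every existing column before adding the new one.
        for col in self.columns:
            col["frozen"] = True
        self.columns.append({"params": params, "frozen": False})

    def trainable_params(self):
        # Only the newest column's parameters receive gradient updates.
        return [c["params"] for c in self.columns if not c["frozen"]]
```

Freezing, rather than regularizing, is what makes the architecture immune to catastrophic forgetting by construction.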

Lateral Connections

Connections that link columns from previous tasks to new columns in the Progressive Networks architecture. They enable knowledge transfer and reuse of learned features.
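A lateral connection amounts to adding adapted activations from earlier columns into the new column's pre-activation. The scalar-weight sketch below is a simplification for illustration; in practice the lateral terms pass through small adapter layers.

```python
def lateral_forward(x, own_w, lateral_ws, prev_acts):
    """Pre-activation of one layer in a new column: its own input path
    plus weighted activations from the matching layer of every earlier
    (frozen) column."""
    h = own_w * x
    for w, a in zip(lateral_ws, prev_acts):
        h += w * a  # lateral term: reuse of previously learned features
    return h
```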

Column-wise Architecture

Organizational structure where each task has its own distinct but interconnected neural column. This architecture facilitates model expansion for new tasks without disrupting existing knowledge.

Forward Transfer

Phenomenon where knowledge acquired from previous tasks improves performance on new tasks. This positive transfer is optimized by lateral connections in Progressive Networks.

Backward Transfer

Ability of a model to retroactively improve performance on previous tasks after learning new tasks. Progressive Networks freeze earlier columns, so they prevent interference with old tasks but provide little backward transfer.

Knowledge Preservation

Mechanism that ensures knowledge acquired from previous tasks is not degraded when learning new tasks. This preservation is fundamental in continual learning.

Task-specific Adapters

Specialized modules inserted in the architecture to adapt general features to the specific requirements of each task. These adapters allow optimal flexibility while preserving shared knowledge.
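A common adapter design is a small bottleneck with a residual connection, sketched below with hypothetical toy dimensions: when the adapter outputs zero, the shared representation passes through unchanged.

```python
def adapter_forward(h, w_down, w_up):
    """Bottleneck adapter sketch: down-project the shared features,
    apply ReLU, up-project, and add a residual connection so the
    shared representation is preserved by default."""
    z = [max(0.0, sum(w * x for w, x in zip(row, h))) for row in w_down]
    out = [sum(w * zj for w, zj in zip(row, z)) for row in w_up]
    return [hi + oi for hi, oi in zip(h, out)]
```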

Capacity Expansion

Strategy for dynamically increasing model capacity by adding new neural columns to accommodate new tasks. This controlled expansion avoids resource saturation.

Neural Plasticity

The ability of neural networks to adapt and modify their connections in response to new information. In Progressive Networks, this plasticity is controlled to preserve existing knowledge.

Synaptic Intelligence

Continual learning method that identifies and protects the synaptic connections important for previous tasks, with importance estimated online during training. This synaptic intelligence is integrated into knowledge preservation mechanisms.
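Synaptic Intelligence attributes each step's loss reduction to individual parameters along the training trajectory. A simplified sketch of the per-parameter path-integral importance:

```python
def si_importance(step_grads, step_deltas, eps=1e-3):
    """Path-integral importance sketch: per step t, accumulate
    omega_i += -g_{t,i} * delta_{t,i} (loss reduction attributed to
    parameter i), then normalize by the squared total displacement."""
    dim = len(step_grads[0])
    omega = [0.0] * dim
    total = [0.0] * dim
    for g, d in zip(step_grads, step_deltas):
        for i in range(dim):
            omega[i] += -g[i] * d[i]
            total[i] += d[i]
    return [omega[i] / (total[i] ** 2 + eps) for i in range(dim)]
```

The resulting importances are then used, as in EWC, to penalize changes to important weights on later tasks.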

Elastic Weight Consolidation

Regularization technique that penalizes changes to the synaptic weights most important for previous tasks, with importance measured by the diagonal Fisher information. This elastic approach trades off learning new tasks against preserving old ones.
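The EWC penalty added to the new-task loss is a weighted quadratic term around the previous-task optimum, sketched here with plain Python lists:

```python
def ewc_penalty(params, star_params, fisher, lam=1.0):
    """EWC regularizer: (lam / 2) * sum_i F_i * (theta_i - theta*_i)^2,
    where F_i is the diagonal Fisher information of parameter i and
    theta* the weights found after the previous task."""
    return 0.5 * lam * sum(
        f * (p - q) ** 2 for p, q, f in zip(params, star_params, fisher)
    )
```

Weights with high Fisher information are expensive to move, so learning is steered toward directions that matter little for the old tasks.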

Memory Aware Synapses

Approach that evaluates the importance of each synapse based on its contribution to past tasks and uses it to constrain future learning. Importance is estimated from the sensitivity of the network output, so it requires no labels.
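In MAS, a parameter's importance is the average absolute gradient of the squared output norm over (unlabeled) samples. Given one such gradient vector per sample, the aggregation is simply:

```python
def mas_importance(output_grads):
    """MAS importance sketch: Omega_i is the mean absolute gradient of
    the squared output norm with respect to parameter i, averaged over
    samples (one gradient vector per sample, labels not required)."""
    n = len(output_grads)
    dim = len(output_grads[0])
    return [sum(abs(g[i]) for g in output_grads) / n for i in range(dim)]
```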

Gradient Episodic Memory

Mechanism that stores examples from previous tasks and uses them to constrain gradient updates when learning new tasks, so that the loss on the stored examples does not increase. This episodic memory counters catastrophic forgetting.
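The core GEM step checks whether the proposed gradient conflicts with the gradient computed on the episodic memory, and if so projects it. A single-constraint sketch:

```python
def gem_project(g, g_mem):
    """GEM constraint sketch: if the proposed gradient g would increase
    the loss on the episodic memory (negative inner product with the
    memory gradient g_mem), project g onto the constraint boundary."""
    dot = sum(a * b for a, b in zip(g, g_mem))
    if dot >= 0.0:
        return g  # no conflict with past tasks: keep the gradient
    coef = dot / sum(b * b for b in g_mem)
    return [a - coef * b for a, b in zip(g, g_mem)]
```

With several past tasks, GEM solves a small quadratic program over all memory gradients; the one-constraint case above reduces to this closed-form projection.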

Dynamic Architecture Expansion

Process of automatically adding new neural resources when the model reaches its maximum capacity. This dynamic expansion is essential for continual learning systems.
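One simple (hypothetical) expansion policy: grow only when the current capacity visibly fails on the new task, e.g. its validation loss stays above a threshold.

```python
def maybe_expand(columns, new_task_loss, threshold, make_column):
    """Expansion trigger sketch: add a fresh column only when the
    existing capacity cannot fit the new task, i.e. its validation
    loss stays above a chosen threshold."""
    if new_task_loss > threshold:
        columns.append(make_column())
        return True
    return False
```

Gating expansion this way keeps growth sublinear in the number of tasks, avoiding the resource saturation mentioned under Capacity Expansion.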

Multi-task Representation

Shared feature space capturing common information across multiple tasks simultaneously. These multi-task representations are optimized in the Progressive Networks architecture.
