
AI Glossary

The Complete Artificial Intelligence Dictionary

162 categories · 2,032 subcategories · 23,060 terms

Progressive Networks

Neural network architecture that adds new columns for each new task while preserving knowledge from previous tasks. This approach enables continual learning without catastrophic forgetting.

Lateral Connections

Connections that link columns from previous tasks to new columns in the Progressive Networks architecture. They enable knowledge transfer and reuse of learned features.
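The two entries above can be sketched together: a minimal, hypothetical two-column progressive network in NumPy, where the second task's column receives the first column's frozen hidden features through a lateral connection. All sizes, names, and weights here are illustrative, not taken from the original paper.

```python
import numpy as np

rng = np.random.default_rng(0)
relu = lambda x: np.maximum(x, 0.0)

# Hypothetical sizes: 4-dim input, 8-dim hidden, 2-dim output per task.
D_IN, D_H, D_OUT = 4, 8, 2

# Column 1 (task 1): trained first, then frozen.
W1_h = rng.normal(size=(D_IN, D_H))

# Column 2 (task 2): its own weights plus a lateral connection U_12
# that reads column 1's frozen hidden features.
W2_h = rng.normal(size=(D_IN, D_H))
U_12 = rng.normal(size=(D_H, D_H))   # lateral: column 1 hidden -> column 2 hidden
W2_o = rng.normal(size=(D_H, D_OUT))

def forward_task2(x):
    h1 = relu(x @ W1_h)              # frozen column 1 features (not updated)
    h2 = relu(x @ W2_h + h1 @ U_12)  # new column + lateral transfer
    return h2 @ W2_o

x = rng.normal(size=(3, D_IN))
out = forward_task2(x)               # shape (3, 2)
```

Because only `W2_h`, `U_12`, and `W2_o` would receive gradients when training task 2, task 1's knowledge is preserved by construction.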

Column-wise Architecture

Organizational structure where each task has its own distinct but interconnected neural column. This architecture facilitates model expansion for new tasks without disrupting existing knowledge.

Forward Transfer

Phenomenon where knowledge acquired from previous tasks improves performance on new tasks. This positive transfer is optimized by lateral connections in Progressive Networks.

Backward Transfer

Ability of a model to retroactively improve performance on previous tasks after learning new tasks. Because Progressive Networks freeze previous columns, they guarantee no negative backward transfer (forgetting), though by the same design they cannot produce positive backward transfer either.
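Forward and backward transfer are commonly quantified with an accuracy matrix, as in the continual-learning evaluation literature: R[i, j] is the accuracy on task j after training sequentially through task i. A toy computation (all numbers below are made up for illustration):

```python
import numpy as np

# R[i, j] = accuracy on task j after training sequentially on tasks 0..i
# (toy numbers; the diagonal is accuracy right after learning each task).
R = np.array([
    [0.90, 0.40, 0.35],
    [0.85, 0.88, 0.50],
    [0.84, 0.86, 0.91],
])
b = np.array([0.33, 0.33, 0.33])  # accuracy of an untrained model on each task

T = R.shape[0]
# Backward transfer: how training on later tasks changed earlier-task accuracy.
bwt = np.mean([R[T - 1, j] - R[j, j] for j in range(T - 1)])
# Forward transfer: accuracy on task j just before training it, vs. baseline.
fwt = np.mean([R[j - 1, j] - b[j] for j in range(1, T)])
print(round(bwt, 3), round(fwt, 3))  # -0.04 0.12
```

Negative `bwt` indicates forgetting; positive `fwt` indicates that earlier tasks helped on task j before it was ever trained.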

Knowledge Preservation

Mechanism that ensures knowledge acquired from previous tasks is not degraded when learning new tasks. This preservation is fundamental in continual learning.

Task-specific Adapters

Specialized modules inserted in the architecture to adapt general features to the specific requirements of each task. These adapters allow optimal flexibility while preserving shared knowledge.
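A minimal sketch of one common adapter design: a bottleneck with a residual connection, inserted after a frozen shared layer. The dimensions and the `make_adapter` helper are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(1)
relu = lambda x: np.maximum(x, 0.0)
D, R = 16, 4  # hypothetical feature dim and adapter bottleneck rank

W_shared = rng.normal(size=(D, D))  # frozen, shared across all tasks

def make_adapter():
    # Bottleneck adapter: down-project, nonlinearity, up-project, residual add.
    down = rng.normal(size=(D, R)) * 0.1
    up = rng.normal(size=(R, D)) * 0.1
    return lambda h: h + relu(h @ down) @ up

adapters = {"task_a": make_adapter(), "task_b": make_adapter()}

def forward(x, task):
    h = relu(x @ W_shared)     # shared features, untouched by new tasks
    return adapters[task](h)   # only the small adapter is task-specific

x = rng.normal(size=(2, D))
out_a = forward(x, "task_a")
out_b = forward(x, "task_b")
```

Only the D×R + R×D adapter weights are trained per task, so adding a task costs a small fraction of the shared model's parameters.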

Capacity Expansion

Strategy for dynamically increasing model capacity by adding new neural columns to accommodate new tasks. This controlled expansion avoids resource saturation.

Neural Plasticity

The ability of neural networks to adapt and modify their connections in response to new information. In Progressive Networks, this plasticity is controlled to preserve existing knowledge.

Synaptic Intelligence

Continuous learning method that identifies and protects synaptic connections important for previous tasks. This synaptic intelligence is integrated into knowledge preservation mechanisms.
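Synaptic Intelligence estimates importance online, by accumulating each parameter's contribution to the loss decrease along the training trajectory and normalizing by how far that parameter moved. A sketch on a toy quadratic loss; the loss, learning rate, and damping term `xi` are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)
theta = rng.normal(size=5)        # parameters being trained on "task 1"
theta_start = theta.copy()
omega = np.zeros_like(theta)      # running path-integral contributions
lr, xi = 0.1, 1e-3                # xi: damping to avoid division by zero

def grad(theta):                  # toy loss L = 0.5 * ||theta||^2
    return theta

for _ in range(100):
    g = grad(theta)
    delta = -lr * g               # plain SGD step
    omega += -g * delta           # each weight's contribution to loss decrease
    theta += delta

# Per-parameter importance, normalized by total displacement on the task.
Omega = omega / ((theta - theta_start) ** 2 + xi)
# On the next task, changes to high-Omega parameters would be penalized.
```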

Elastic Weight Consolidation

Regularization technique that penalizes significant changes in synaptic weights crucial for previous tasks. This elastic approach allows a compromise between learning and preservation.
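The EWC loss can be written as L = L_new + (λ/2) Σ_i F_i (θ_i − θ*_i)², where F is a Fisher-information estimate of per-weight importance and θ* are the weights after the previous task. A sketch of the penalty term with made-up numbers:

```python
import numpy as np

# EWC penalty (sketch): quadratic cost weighted by estimated importance.
def ewc_penalty(theta, theta_star, fisher, lam=1.0):
    return 0.5 * lam * np.sum(fisher * (theta - theta_star) ** 2)

theta_star = np.array([1.0, -2.0, 0.5])   # weights after task 1
fisher = np.array([5.0, 0.1, 2.0])        # estimated importance per weight
theta = np.array([1.2, -1.0, 0.5])        # current weights while on task 2

# Moving the important first weight costs far more than the unimportant
# second, even though the second moved five times further.
penalty = ewc_penalty(theta, theta_star, fisher)
```

Adding this penalty to the new-task loss yields the "elastic" compromise the definition describes: important weights are pulled back toward θ*, unimportant ones move freely.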

Memory Aware Synapses

Approach that evaluates the importance of each synapse based on its contribution to past tasks, and uses those importance estimates to guide future learning. Because it only needs the model's outputs, importance can be computed from unlabeled data, and critical weights are protected while unimportant ones remain free to change.
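A sketch of the MAS idea for a linear model: importance is the average magnitude of the gradient of the squared output norm with respect to each weight, computed from unlabeled inputs. For f(x) = Wx, that gradient is 2(Wx)xᵀ. All values here are synthetic.

```python
import numpy as np

rng = np.random.default_rng(0)
W = rng.normal(size=(3, 4))        # weights learned on past tasks
X = rng.normal(size=(50, 4))       # unlabeled samples (no targets needed)

importance = np.zeros_like(W)
for x in X:
    out = W @ x
    # d ||Wx||^2 / dW = 2 (Wx) x^T ; accumulate its magnitude per weight.
    importance += np.abs(2.0 * np.outer(out, x))
importance /= len(X)
# Large entries mark synapses whose change would perturb outputs the most;
# a quadratic penalty weighted by `importance` then protects them.
```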

Gradient Episodic Memory

Mechanism for storing and retrieving examples from previous tasks to counter catastrophic forgetting. This episodic memory guides gradients when learning new tasks.
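A common way to use the stored examples is the single-constraint (A-GEM-style) projection: if the new-task gradient conflicts with the gradient computed on memory examples, project it so the memory loss does not increase. A sketch:

```python
import numpy as np

def project_gradient(g, g_mem):
    """If the new-task gradient g conflicts with the episodic-memory
    gradient g_mem (negative dot product), project g onto the orthogonal
    complement of g_mem so the memory loss does not increase."""
    dot = g @ g_mem
    if dot < 0:
        g = g - (dot / (g_mem @ g_mem)) * g_mem
    return g

g = np.array([1.0, -2.0])          # gradient on the new task
g_mem = np.array([1.0, 1.0])       # gradient on stored past examples
g_new = project_gradient(g, g_mem)
# After projection the gradient no longer conflicts: g_new @ g_mem == 0.
```

The full GEM formulation enforces one such constraint per previous task via quadratic programming; the single-constraint version shown here is the cheaper approximation.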

Dynamic Architecture Expansion

Process of automatically adding new neural resources when the model reaches its maximum capacity. This dynamic expansion is essential for continuous learning systems.
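A toy sketch of an expansion trigger, assuming a hypothetical error threshold as the saturation criterion; real systems use more elaborate tests, but the control flow is the same.

```python
import numpy as np

class ExpandingNet:
    """Sketch: grow hidden capacity when new-task error stays too high
    (the threshold criterion is a stand-in for a real saturation test)."""
    def __init__(self, d_in, grow_by=8):
        self.grow_by = grow_by
        self.W_h = np.empty((d_in, 0))    # hidden weights, grown over time

    def maybe_expand(self, task_error, threshold, rng):
        if task_error > threshold:        # capacity saturated for this task
            new = rng.normal(size=(self.W_h.shape[0], self.grow_by)) * 0.1
            self.W_h = np.concatenate([self.W_h, new], axis=1)
            return True
        return False

rng = np.random.default_rng(0)
net = ExpandingNet(d_in=4)
net.maybe_expand(task_error=0.9, threshold=0.3, rng=rng)   # expands
net.maybe_expand(task_error=0.1, threshold=0.3, rng=rng)   # capacity suffices
print(net.W_h.shape)  # (4, 8)
```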

Multi-task Representation

Shared feature space capturing common information across multiple tasks simultaneously. These multi-task representations are optimized in the Progressive Networks architecture.
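The standard hard-parameter-sharing sketch: one shared trunk produces the multi-task representation, and small task-specific heads read it out. Sizes and task names are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
relu = lambda x: np.maximum(x, 0.0)

# Shared trunk learns one feature space; lightweight heads specialize it.
W_trunk = rng.normal(size=(6, 16))
heads = {"task_a": rng.normal(size=(16, 3)),   # e.g. 3-way classification
         "task_b": rng.normal(size=(16, 1))}   # e.g. scalar regression

def forward(x, task):
    z = relu(x @ W_trunk)     # multi-task representation, shared
    return z @ heads[task]    # task-specific readout

x = rng.normal(size=(5, 6))
out_a = forward(x, "task_a")  # shape (5, 3)
out_b = forward(x, "task_b")  # shape (5, 1)
```

Training the trunk on all tasks jointly is what forces `z` to capture information common to the tasks rather than features useful to only one.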
