AI Glossary

The Complete Artificial Intelligence Dictionary

162 Categories · 2,032 Subcategories · 23,060 Terms
📖 Bidirectional Encoder

Component that processes the entire input sequence simultaneously, allowing each token to attend to all other tokens, both past and future, for complete contextual understanding.
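A minimal sketch of unmasked (bidirectional) self-attention, assuming PyTorch; the function name and projection matrices are illustrative. Because no mask is applied to the score matrix, every row of the attention weights mixes information from all tokens, past and future.

```python
# Hedged sketch: bidirectional self-attention with no mask (assumes PyTorch).
import torch
import torch.nn.functional as F

def bidirectional_self_attention(x, w_q, w_k, w_v):
    """x: (seq_len, d_model); w_q/w_k/w_v: (d_model, d_model) projections."""
    q, k, v = x @ w_q, x @ w_k, x @ w_v
    scores = q @ k.T / (k.shape[-1] ** 0.5)   # (seq_len, seq_len), no mask applied
    weights = F.softmax(scores, dim=-1)        # each position attends to ALL positions
    return weights @ v

seq_len, d_model = 5, 16
x = torch.randn(seq_len, d_model)
w = [torch.randn(d_model, d_model) for _ in range(3)]
print(bidirectional_self_attention(x, *w).shape)  # torch.Size([5, 16])
```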

📖 Autoregressive Decoder

Generation mechanism where the decoder produces the output sequence token by token, based solely on previously generated tokens and the encoder's contextual representation.
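A hedged sketch of the token-by-token generation loop, assuming PyTorch's `nn.TransformerDecoder`; the untrained modules, the `bos_id`/`eos_id` token ids, and the random encoder memory are placeholders for illustration only. Each step re-feeds everything generated so far together with the encoder's representation, then picks the next token greedily.

```python
# Hedged sketch: autoregressive decoding with an untrained decoder (assumes PyTorch).
import torch
import torch.nn as nn

d_model, vocab_size, bos_id, eos_id = 32, 100, 1, 2
decoder = nn.TransformerDecoder(nn.TransformerDecoderLayer(d_model, nhead=4), num_layers=2)
embed = nn.Embedding(vocab_size, d_model)
to_logits = nn.Linear(d_model, vocab_size)

memory = torch.randn(10, 1, d_model)   # encoder's contextual representation (placeholder)
generated = [bos_id]                   # start from a beginning-of-sequence token
for _ in range(20):                    # generate at most 20 tokens
    tgt = embed(torch.tensor(generated)).unsqueeze(1)            # (t, 1, d_model)
    mask = nn.Transformer.generate_square_subsequent_mask(len(generated))
    out = decoder(tgt, memory, tgt_mask=mask)                     # attends only to the past
    next_id = to_logits(out[-1, 0]).argmax().item()               # greedy choice of next token
    generated.append(next_id)
    if next_id == eos_id:
        break
print(generated)
```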

📖 Cross-Attention Mechanism

Process in the decoder that allows it to focus on specific parts of the encoder's output, weighting the importance of each input token for generating the current output token.
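A minimal sketch, assuming PyTorch: the decoder states provide the queries while the encoder output provides keys and values, so the resulting weight matrix gives, per output position, the importance assigned to each input token. All tensors and projection matrices here are illustrative placeholders.

```python
# Hedged sketch: cross-attention between decoder queries and encoder keys/values.
import torch
import torch.nn.functional as F

d_model = 16
decoder_states = torch.randn(3, d_model)   # 3 target positions (queries)
encoder_output = torch.randn(7, d_model)   # 7 source tokens (keys and values)

w_q, w_k, w_v = (torch.randn(d_model, d_model) for _ in range(3))
q = decoder_states @ w_q
k = encoder_output @ w_k
v = encoder_output @ w_v

weights = F.softmax(q @ k.T / d_model ** 0.5, dim=-1)  # (3, 7): weight of each input token
context = weights @ v                                   # (3, d_model): mixed source information
print(weights.sum(dim=-1))                              # each row sums to 1.0
```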

📖 Causal Masking

Technique applied in the decoder to prevent each position from attending to future positions, thus ensuring the autoregressive nature of generation and preventing information leakage.
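A minimal sketch, assuming PyTorch: entries above the diagonal of the score matrix are set to negative infinity before the softmax, so the attention weights for future positions become exactly zero.

```python
# Hedged sketch: applying a causal mask to attention scores (assumes PyTorch).
import torch
import torch.nn.functional as F

seq_len = 5
scores = torch.randn(seq_len, seq_len)                                  # raw attention scores
causal_mask = torch.triu(torch.ones(seq_len, seq_len, dtype=torch.bool), diagonal=1)
masked_scores = scores.masked_fill(causal_mask, float("-inf"))          # hide the future
weights = F.softmax(masked_scores, dim=-1)
print(weights)  # upper triangle is 0: position i never attends to positions > i
```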

📖 Feed-Forward Network

Fully connected neural network applied to each position independently after the attention mechanism, enabling nonlinear transformation and higher-dimensional projection.
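A minimal sketch, assuming PyTorch; the 4x expansion from `d_model` to `d_ff` is a common convention rather than a requirement. The same two-layer MLP is applied to every position independently, so there is no mixing along the sequence axis.

```python
# Hedged sketch: position-wise feed-forward network (assumes PyTorch).
import torch
import torch.nn as nn

d_model, d_ff = 64, 256   # 4x expansion is an assumption, not a fixed rule

ffn = nn.Sequential(
    nn.Linear(d_model, d_ff),   # projection to a higher-dimensional space
    nn.ReLU(),                  # nonlinear transformation
    nn.Linear(d_ff, d_model),   # projection back to the model dimension
)

x = torch.randn(2, 10, d_model)   # (batch, seq_len, d_model)
out = ffn(x)                      # applied per position, independently
print(out.shape)                  # torch.Size([2, 10, 64])
```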

📖 Layer Normalization

Normalization technique that stabilizes activations by normalizing the features of each individual example across the feature dimension, accelerating convergence and improving overall model performance.
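A minimal sketch, assuming PyTorch: the mean and variance are computed per example over the feature dimension, and the manual result is checked against `nn.LayerNorm` with the learnable affine parameters disabled.

```python
# Hedged sketch: per-example layer normalization (assumes PyTorch).
import torch
import torch.nn as nn

def layer_norm(x, eps=1e-5):
    mean = x.mean(dim=-1, keepdim=True)                 # statistics per example
    var = x.var(dim=-1, keepdim=True, unbiased=False)   # over the feature dimension
    return (x - mean) / torch.sqrt(var + eps)

x = torch.randn(4, 16)
manual = layer_norm(x)
builtin = nn.LayerNorm(16, elementwise_affine=False)(x)
print(torch.allclose(manual, builtin, atol=1e-5))  # True
```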

📖 Encoder Bottleneck

Fixed-dimensional vector representation, often the final output of the encoder, that condenses all information from the input sequence and serves as the sole context for the decoder during generation.
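A minimal sketch, assuming PyTorch and a GRU encoder as a stand-in for any sequence encoder: a 50-token input is compressed into a single fixed-size hidden state, which would be the only context handed to the decoder.

```python
# Hedged sketch: encoder bottleneck as the final hidden state of a GRU (assumes PyTorch).
import torch
import torch.nn as nn

d_model = 32
encoder = nn.GRU(input_size=d_model, hidden_size=d_model, batch_first=True)

inputs = torch.randn(1, 50, d_model)   # a 50-token input sequence (placeholder features)
_, final_hidden = encoder(inputs)      # (num_layers, batch, d_model)
bottleneck = final_hidden[-1]          # (1, d_model): all 50 tokens condensed into one vector
print(bottleneck.shape)                # torch.Size([1, 32])
# A decoder conditioned only on `bottleneck` must reconstruct the entire output from it.
```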

📖 Token Embeddings

High-dimensional dense vectors that represent each discrete token from the vocabulary in a continuous space, capturing semantic and syntactic information learned during training.
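A minimal sketch, assuming PyTorch: `nn.Embedding` holds one trainable dense vector per vocabulary id, and looking up the same id always returns the same vector (until training updates it).

```python
# Hedged sketch: mapping discrete token ids to dense vectors (assumes PyTorch).
import torch
import torch.nn as nn

vocab_size, d_model = 10_000, 128
embedding = nn.Embedding(vocab_size, d_model)   # one learnable vector per token id

token_ids = torch.tensor([42, 7, 42, 999])      # a toy tokenized sequence
vectors = embedding(token_ids)                  # (4, 128) dense continuous vectors
print(vectors.shape)
print(torch.equal(vectors[0], vectors[2]))      # True: same id, same embedding
```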
