
AI Glossary

The Complete Artificial Intelligence Dictionary

162 categories · 2,032 subcategories · 23,060 terms
📖 Token Alignment

Process by which cross-attention learns to automatically align significant tokens or segments between two sequences of different lengths or structures. Crucial for translation tasks where correspondences are not bijective.
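As a minimal illustration (a sketch, not code from the glossary), here is scaled dot-product cross-attention in pure Python: the weight matrix it produces acts as a soft alignment between two sequences of different lengths, and a hard alignment can be read off as the argmax of each row.

```python
import math

def softmax(xs):
    # Numerically stable softmax over a list of scores.
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    z = sum(exps)
    return [e / z for e in exps]

def cross_attention(queries, keys, values):
    # Scaled dot-product cross-attention: each query attends over the
    # key/value sequence; the weight matrix is a soft alignment between
    # the two sequences (which may have different lengths).
    d = len(queries[0])
    outputs, weights = [], []
    for q in queries:
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d)
                  for k in keys]
        w = softmax(scores)
        weights.append(w)
        outputs.append([sum(wi * v[j] for wi, v in zip(w, values))
                        for j in range(len(values[0]))])
    return outputs, weights

# Toy example: 2 query tokens aligned against 3 key tokens.
queries = [[1.0, 0.0], [0.0, 1.0]]
keys    = [[2.0, 0.0], [0.0, 2.0], [1.0, 1.0]]
outs, w = cross_attention(queries, keys, keys)
alignment = [max(range(len(keys)), key=lambda j: row[j]) for row in w]
```

Each weight row sums to 1, so row *i* is a probability distribution describing which target tokens query token *i* aligns to; here `alignment` recovers `[0, 1]`.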

📖 Sparse Cross-Attention

Optimization of cross-attention limiting attentional connections to predefined or learned subsets of relevant positions. Reduces computational complexity from O(n²) to O(n log n) or O(n) for long sequences.
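One common sparsification pattern is top-k selection (other schemes use fixed windows or learned masks); the hedged sketch below keeps only each query's k highest-scoring key positions and zeroes out the rest:

```python
import math

def sparse_cross_attention(queries, keys, values, k=2):
    # Each query keeps only its k highest-scoring key positions;
    # softmax is taken over that subset and all other weights are
    # exactly zero, reducing work when k << len(keys).
    d = len(queries[0])
    outputs, weights = [], []
    for q in queries:
        scores = [sum(a * b for a, b in zip(q, key)) / math.sqrt(d)
                  for key in keys]
        topk = sorted(range(len(keys)), key=lambda i: scores[i],
                      reverse=True)[:k]
        m = max(scores[i] for i in topk)
        exps = {i: math.exp(scores[i] - m) for i in topk}
        z = sum(exps.values())
        w = [exps.get(i, 0.0) / z for i in range(len(keys))]
        weights.append(w)
        outputs.append([sum(w[i] * values[i][j] for i in range(len(keys)))
                        for j in range(len(values[0]))])
    return outputs, weights
```

In a real implementation the top-k step would itself use an approximate or structured scheme; scoring every position first, as here, is only for clarity.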

📖 Cross-Attention with Relative Position

Extension of cross-attention incorporating relative position information between elements of the two sequences rather than absolute positions. Improves generalization to sequence lengths not seen during training.
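A minimal sketch of the idea (the bias table would normally be learned; here it is just a list): a scalar bias indexed by the clipped relative offset `j - i` is added to each raw attention score before the softmax.

```python
def add_relative_bias(raw_scores, rel_bias):
    # Adds a bias indexed by the clipped relative offset (j - i) to
    # each raw attention score. Clipping offsets to a maximum distance
    # is what lets the model generalize to sequence lengths not seen
    # during training: any farther pair shares the boundary bias.
    max_rel = (len(rel_bias) - 1) // 2   # offsets in [-max_rel, +max_rel]
    out = []
    for i, row in enumerate(raw_scores):
        new_row = []
        for j, s in enumerate(row):
            off = max(-max_rel, min(max_rel, j - i))
            new_row.append(s + rel_bias[off + max_rel])
        out.append(new_row)
    return out
```

Because only the relative offset matters, the same small bias table covers arbitrarily long sequences.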

📖 Adaptive Cross-Attention

Attention mechanism dynamically adjusting its focus based on context or the model's internal state. Enables flexible allocation of attentional resources according to the complexity or importance of inter-sequence regions.

📖 Cross-Attention Pooling

Aggregation technique using cross-attention to selectively weight and combine features from a target sequence based on a query sequence. Generates globally informed contextual representations for classification or regression.
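A sketch of the simplest case, assuming a single probe vector standing in for the query sequence: it attends over the whole target sequence and returns one pooled vector for a downstream classifier or regressor.

```python
import math

def attention_pool(probe, keys, values):
    # A single query vector (e.g. a learned probe derived from the
    # query sequence) attends over the entire target sequence and
    # returns one globally informed pooled representation.
    d = len(probe)
    scores = [sum(a * b for a, b in zip(probe, k)) / math.sqrt(d)
              for k in keys]
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    z = sum(exps)
    w = [e / z for e in exps]
    return [sum(wi * v[j] for wi, v in zip(w, values))
            for j in range(len(values[0]))]
```

Unlike mean pooling, the weights depend on the probe, so the same sequence can be summarized differently for different query contexts.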

📖 Bilateral Cross-Attention

Symmetric architecture applying cross-attention in both directions between two sequences, enabling complete bidirectional interaction. Used in tasks requiring mutual alignment such as paraphrasing or semantic matching.

📖 Cross-Attention Regularization

Constraint techniques applied to cross-attention weights to encourage desirable properties such as sparsity, diversity, or temporal coherence. Improves model interpretability and generalization.
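One concrete instance of such a constraint (an illustrative choice, not the only one) is an entropy penalty on the attention rows: adding it to the training loss pushes each row toward a peaked, sparse distribution, while negating it would instead encourage diversity.

```python
import math

def attention_entropy_penalty(weights, coeff=0.01):
    # Mean row entropy of an attention-weight matrix. Added to the
    # loss with a small coefficient, it encourages peaked (sparse)
    # rows; the 1e-12 guard avoids log(0) on exact zeros.
    total = 0.0
    for row in weights:
        total += -sum(w * math.log(w + 1e-12) for w in row)
    return coeff * total / len(weights)
```

The coefficient name and value here are placeholders; in practice the strength is tuned per task.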

📖 Memory-Augmented Cross-Attention

Extension of cross-attention integrating external or persistent memory accessible via attention mechanisms. Allows storing and retrieving information beyond the immediate context window for long-range tasks.
