
AI Terminology

A complete dictionary of Artificial Intelligence

162 categories · 2,032 subcategories · 23,060 terms

Auto-regression

Generation process where each token is predicted sequentially based on all previous tokens, enabling progressive and coherent text construction.
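The sequential, token-by-token loop can be sketched as follows. `next_token` here is a hypothetical placeholder for a real language model's prediction step; the point is that each prediction conditions only on the tokens produced so far.

```python
# Minimal sketch of autoregressive generation. `next_token` is a toy
# stand-in (not a real model): it looks only at the context built so far.
def next_token(context):
    # Placeholder rule: increment the last token (mod 10).
    return (context[-1] + 1) % 10

def generate(prompt, n_steps):
    tokens = list(prompt)
    for _ in range(n_steps):
        # Each new token is predicted from all previous tokens.
        tokens.append(next_token(tokens))
    return tokens

print(generate([3], 4))  # → [3, 4, 5, 6, 7]
```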


Decoder-Only Architecture

Transformer model structure that eliminates encoders to focus solely on the decoder, optimized for text generation using masked attention to prevent future information leakage.


Multi-Head Attention Mechanism

Technique allowing the model to simultaneously focus on different positions in the input sequence through multiple independent attention heads, capturing various types of dependencies.
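A shape-level sketch of the head split, assuming NumPy and omitting the learned query/key/value projections of a real implementation: the model dimension is divided across independent heads, and each head runs scaled dot-product attention on its own slice.

```python
import numpy as np

# Sketch only: real multi-head attention also applies learned linear
# projections before the split and after the merge.
def split_heads(x, n_heads):
    seq, d_model = x.shape
    # (seq, d_model) -> (heads, seq, d_head)
    return x.reshape(seq, n_heads, d_model // n_heads).transpose(1, 0, 2)

def attention(q, k, v):
    # Scaled dot-product attention, computed per head.
    scores = q @ k.transpose(0, 2, 1) / np.sqrt(q.shape[-1])
    weights = np.exp(scores) / np.exp(scores).sum(-1, keepdims=True)  # row softmax
    return weights @ v

x = np.random.randn(6, 8)           # 6 tokens, model dimension 8
q = k = v = split_heads(x, n_heads=2)
out = attention(q, k, v)            # the 2 heads attend independently
print(out.shape)                    # (2, 6, 4)
```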


BPE Tokenization

Byte-Pair Encoding algorithm that segments text into optimal subwords, balancing vocabulary size and semantic coverage for efficient natural language processing.
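One BPE training step can be sketched as: count all adjacent symbol pairs across the corpus and fuse the most frequent pair into a new subword unit. This is a toy sketch of that single merge step, not a full tokenizer.

```python
from collections import Counter

# Toy sketch of one BPE merge step on a tiny corpus of symbol
# sequences with frequencies.
def most_frequent_pair(words):
    pairs = Counter()
    for symbols, freq in words.items():
        for a, b in zip(symbols, symbols[1:]):
            pairs[(a, b)] += freq
    return max(pairs, key=pairs.get)

def merge_pair(words, pair):
    merged = {}
    for symbols, freq in words.items():
        out, i = [], 0
        while i < len(symbols):
            # Fuse the chosen pair wherever it occurs.
            if i + 1 < len(symbols) and (symbols[i], symbols[i + 1]) == pair:
                out.append(symbols[i] + symbols[i + 1])
                i += 2
            else:
                out.append(symbols[i])
                i += 1
        merged[tuple(out)] = freq
    return merged

words = {("l", "o", "w"): 5, ("l", "o", "w", "e", "r"): 2}
pair = most_frequent_pair(words)
print(merge_pair(words, pair))
```

Repeating this merge step until a target vocabulary size is reached is what trades off vocabulary size against coverage.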


Causal Attention Mask

Binary matrix applied during attention to prevent each position from attending to future positions, thus preserving the causal nature of text generation.
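The mask is typically a lower-triangular matrix: position i may attend only to positions ≤ i. A minimal NumPy sketch, where blocked entries are set to negative infinity so they vanish after the softmax:

```python
import numpy as np

seq_len = 4
# True where attention is allowed (lower triangle, including the diagonal).
mask = np.tril(np.ones((seq_len, seq_len), dtype=bool))

scores = np.zeros((seq_len, seq_len))      # placeholder attention scores
scores[~mask] = -np.inf                    # block future positions
weights = np.exp(scores)
weights /= weights.sum(-1, keepdims=True)  # softmax row by row
print(weights.round(2))
# Row i is uniform over positions 0..i, e.g. row 0 = [1, 0, 0, 0]
```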


Model Parameters

Trainable weights of the neural network, whose number characterizes the model's capacity, ranging from millions to billions depending on the desired complexity and performance.


Temperature Sampling

Parameter controlling the degree of randomness in generation, where high values increase diversity and low values favor safer and more coherent predictions.
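Concretely, the logits are divided by the temperature before the softmax: T < 1 sharpens the distribution toward the most likely token, T > 1 flattens it toward uniform. A minimal sketch:

```python
import numpy as np

def softmax_with_temperature(logits, temperature):
    z = np.array(logits) / temperature
    z -= z.max()              # numerical stability
    e = np.exp(z)
    return e / e.sum()

logits = [2.0, 1.0, 0.1]
print(softmax_with_temperature(logits, 1.0).round(3))   # moderate spread
print(softmax_with_temperature(logits, 0.1).round(3))   # nearly one-hot (safer)
print(softmax_with_temperature(logits, 10.0).round(3))  # nearly uniform (more diverse)
```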


Context Window

Maximum number of tokens the model can consider simultaneously during generation, determining its ability to maintain coherence over long texts.
