AI Glossary

The complete glossary of AI

162 categories · 2,032 subcategories · 23,060 terms

Bidirectional Encoder

Component that processes the entire input sequence simultaneously, allowing each token to attend to all other tokens, both past and future, for complete contextual understanding.
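
A minimal PyTorch sketch of unmasked (bidirectional) self-attention; the single-head, projection-free setup and the sizes are illustrative assumptions, not part of the definition above:

```python
import torch
import torch.nn.functional as F

# Minimal sketch: bidirectional self-attention over a toy sequence.
# Every position attends to every other position (no mask), unlike a
# causal decoder.
torch.manual_seed(0)
seq_len, d_model = 5, 16
x = torch.randn(seq_len, d_model)    # token representations

q, k, v = x, x, x                    # single head, no learned projections
scores = q @ k.T / d_model ** 0.5    # (seq_len, seq_len) attention logits
weights = F.softmax(scores, dim=-1)  # each row sums to 1 over ALL tokens
out = weights @ v                    # contextualized representations

print(weights[0])  # position 0 attends to past AND future positions
```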

Autoregressive Decoder

Generation mechanism where the decoder produces the output sequence token by token, based solely on previously generated tokens and the encoder's contextual representation.
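
A minimal sketch of the token-by-token loop; the `toy_logits` function is a hypothetical stand-in for a real decoder forward pass, and the vocabulary size and special token ids are arbitrary assumptions:

```python
import torch

# Minimal sketch of autoregressive decoding: the next token depends only
# on tokens generated so far (plus, in a full model, the encoder output).
vocab_size = 10

def toy_logits(prefix: torch.Tensor) -> torch.Tensor:
    torch.manual_seed(int(prefix.sum()))  # deterministic stand-in for a model
    return torch.randn(vocab_size)

bos, eos, max_len = 0, 9, 8
tokens = torch.tensor([bos])
for _ in range(max_len):
    next_token = toy_logits(tokens).argmax()          # greedy: most likely token
    tokens = torch.cat([tokens, next_token.unsqueeze(0)])
    if next_token.item() == eos:                      # stop at end-of-sequence
        break
print(tokens.tolist())
```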

Cross-Attention Mechanism

Process in the decoder that allows it to focus on specific parts of the encoder's output, weighting the importance of each input token for generating the current output token.
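
A minimal PyTorch sketch of single-head cross-attention, with queries taken from the decoder state and keys/values from the encoder output; all sizes are arbitrary assumptions:

```python
import torch
import torch.nn.functional as F

# Minimal sketch: cross-attention. Queries come from the decoder state,
# keys/values from the encoder output, so each output step can weight
# the input tokens differently.
torch.manual_seed(0)
d_model, src_len, tgt_len = 16, 7, 3
encoder_out = torch.randn(src_len, d_model)    # one vector per input token
decoder_state = torch.randn(tgt_len, d_model)  # one vector per output step

scores = decoder_state @ encoder_out.T / d_model ** 0.5  # (tgt_len, src_len)
weights = F.softmax(scores, dim=-1)  # per-step weighting of input tokens
context = weights @ encoder_out      # (tgt_len, d_model) mixed input info

print(weights.shape, context.shape)
```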

Causal Masking

Technique applied in the decoder to prevent each position from attending to future positions, thus ensuring the autoregressive nature of generation and preventing information leakage.
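
A minimal sketch, assuming single-head attention without learned projections, showing how an upper-triangular mask removes attention to future positions:

```python
import torch
import torch.nn.functional as F

# Minimal sketch: causal masking. Future positions get -inf logits, so
# they receive zero attention weight after the softmax.
torch.manual_seed(0)
seq_len, d_model = 5, 16
x = torch.randn(seq_len, d_model)

scores = x @ x.T / d_model ** 0.5
mask = torch.triu(torch.ones(seq_len, seq_len, dtype=torch.bool), diagonal=1)
scores = scores.masked_fill(mask, float("-inf"))  # block attention to the future
weights = F.softmax(scores, dim=-1)

print(weights)  # lower-triangular: position i only attends to positions <= i
```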

Feed-Forward Network

Fully connected neural network applied to each position independently after the attention mechanism, enabling nonlinear transformation and higher-dimensional projection.
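
A minimal PyTorch sketch of the position-wise block; the hidden size `d_ff = 64` is an arbitrary assumption (in practice it is typically about four times `d_model`):

```python
import torch
import torch.nn as nn

# Minimal sketch of the position-wise feed-forward block: the same
# two-layer MLP is applied to every position independently.
d_model, d_ff = 16, 64
ffn = nn.Sequential(
    nn.Linear(d_model, d_ff),  # project up to a higher dimension
    nn.ReLU(),                 # nonlinearity
    nn.Linear(d_ff, d_model),  # project back down
)

x = torch.randn(5, d_model)    # 5 positions
out = ffn(x)                   # applied per position: no mixing across tokens
print(out.shape)               # torch.Size([5, 16])
```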

Layer Normalization

Normalization technique that stabilizes activations by normalizing the features of each individual example across the feature dimension, accelerating convergence and improving overall model performance.
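
A minimal sketch comparing a by-hand computation with PyTorch's built-in `layer_norm`; the tensor sizes are arbitrary assumptions:

```python
import torch
import torch.nn.functional as F

# Minimal sketch: layer normalization computed by hand, normalizing
# across the feature dimension of each example (not across the batch).
torch.manual_seed(0)
x = torch.randn(3, 16)               # 3 positions, 16 features each

mean = x.mean(dim=-1, keepdim=True)  # per-position mean over features
var = x.var(dim=-1, keepdim=True, unbiased=False)
manual = (x - mean) / torch.sqrt(var + 1e-5)

builtin = F.layer_norm(x, (16,))     # same result from the built-in op
print(torch.allclose(manual, builtin, atol=1e-5))  # True
```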

Encoder Bottleneck

Fixed-dimensional vector representation, often the final output of the encoder, that condenses all information from the input sequence and serves as the sole context for the decoder during generation.
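
A minimal sketch in the classic seq2seq setting, assuming a GRU encoder whose final hidden state acts as the bottleneck; all sizes are arbitrary assumptions:

```python
import torch
import torch.nn as nn

# Minimal sketch of an encoder bottleneck: a GRU reads the whole input
# and its final hidden state is the single fixed-size vector the decoder
# would condition on.
torch.manual_seed(0)
d_in, d_hidden, src_len = 8, 32, 10
encoder = nn.GRU(d_in, d_hidden, batch_first=True)

src = torch.randn(1, src_len, d_in)  # one sequence of 10 input vectors
_, h_final = encoder(src)            # h_final: (1, 1, d_hidden)
bottleneck = h_final.squeeze()       # (d_hidden,) fixed-size summary

print(bottleneck.shape)  # all input information squeezed into 32 numbers
```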

Token Embeddings

High-dimensional dense vectors that represent each discrete token from the vocabulary in a continuous space, capturing semantic and syntactic information learned during training.
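
A minimal PyTorch sketch of an embedding lookup; the vocabulary size, dimension, and token ids are arbitrary assumptions:

```python
import torch
import torch.nn as nn

# Minimal sketch: an embedding table maps discrete token ids to dense
# vectors in a continuous space.
torch.manual_seed(0)
vocab_size, d_model = 100, 16
embedding = nn.Embedding(vocab_size, d_model)  # learnable (vocab, d_model) table

token_ids = torch.tensor([3, 41, 7])  # a tokenized "sentence"
vectors = embedding(token_ids)        # (3, 16) dense representations

print(vectors.shape)
# Training shapes the table so related tokens get similar vectors.
```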
