
AI Glossary

The complete glossary of AI

162 categories · 2,032 subcategories · 23,060 terms
Attention Weights Visualization

Graphical technique for displaying the numerical attention values between tokens in a sequence, using color or size intensity to convey the strength of the relationships between tokens.

Heat Maps

Two-dimensional matrix representation where colors encode the intensity of attention weights, allowing quick identification of areas of high attentional concentration.
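
A minimal text-based sketch of the idea: real attention heat maps are usually drawn with a plotting library, but here character density stands in for color intensity so the example runs without a graphics backend. The tokens and weights are illustrative.

```python
import numpy as np

def ascii_heatmap(weights, tokens, ramp=" .:-=+*#@"):
    """Render an attention matrix as a character grid: denser glyph = higher weight."""
    lines = ["      " + " ".join(f"{t:>3}" for t in tokens)]
    for tok, row in zip(tokens, weights):
        idx = (row * (len(ramp) - 1)).round().astype(int)  # weight -> glyph index
        cells = " ".join(f"{ramp[i]:>3}" for i in idx)
        lines.append(f"{tok:>5} {cells}")
    return "\n".join(lines)

tokens = ["the", "cat", "sat"]              # illustrative sentence
rng = np.random.default_rng(2)
W = rng.random((3, 3))
W /= W.sum(axis=1, keepdims=True)           # row-normalized attention weights
print(ascii_heatmap(W, tokens))
```

The same matrix passed to a plotting routine (e.g. an image/heatmap function) gives the conventional colored view; the structure being read off is identical.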

Attention Heads Analysis

Comparative study of individual attention patterns in each head of the multi-head mechanism, revealing functional specializations and redundancies between heads.

Multi-Head Attention Patterns

Simultaneous visualization of different attention mechanisms in a Transformer layer, showing how each head captures distinct types of syntactic or semantic relationships.

Self-Attention Matrix

Square matrix representing attention weights between all pairs of tokens in the same sequence, where each element (i,j) indicates the influence of token j on token i.
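
A short numpy sketch of how such a matrix arises from scaled dot-product attention; the token count and dimensionality are illustrative, not tied to any particular model.

```python
import numpy as np

def self_attention_matrix(Q, K):
    """Scaled dot-product attention weights: softmax(Q K^T / sqrt(d))."""
    d = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d)                  # raw pairwise scores
    scores -= scores.max(axis=-1, keepdims=True)   # numerical stability
    weights = np.exp(scores)
    return weights / weights.sum(axis=-1, keepdims=True)

rng = np.random.default_rng(0)
n_tokens, d = 5, 8                                 # illustrative sizes
Q = rng.normal(size=(n_tokens, d))
K = rng.normal(size=(n_tokens, d))
A = self_attention_matrix(Q, K)
# Row i is a probability distribution: A[i, j] = influence of token j on token i.
print(A.shape)        # (5, 5)
print(A.sum(axis=1))  # each row sums to 1
```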

Cross-Attention Visualization

Graphical representation of attention weights between two different sequences, typically used in encoder-decoder models to visualize source-target alignments.

Attention Rollout

Method that recursively propagates attention weights through successive layers to estimate the cumulative influence of each input token on the model's final predictions.
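
A sketch of the core computation, assuming one head-averaged attention matrix per layer: each layer's matrix is mixed with the identity (to account for residual connections), renormalized, and multiplied into the running product. The matrices here are random stand-ins.

```python
import numpy as np

def attention_rollout(layer_attns):
    """Cumulative token influence across layers via recursive matrix products."""
    n = layer_attns[0].shape[0]
    rollout = np.eye(n)
    for A in layer_attns:
        A_res = 0.5 * A + 0.5 * np.eye(n)           # residual connection
        A_res /= A_res.sum(axis=-1, keepdims=True)  # keep rows normalized
        rollout = A_res @ rollout                   # propagate one layer up
    return rollout

rng = np.random.default_rng(1)
n_tokens, n_layers = 4, 3                           # illustrative sizes
layers = []
for _ in range(n_layers):
    A = rng.random((n_tokens, n_tokens))
    layers.append(A / A.sum(axis=-1, keepdims=True))
R = attention_rollout(layers)
print(R.sum(axis=1))  # rows stay valid distributions (each sums to 1)
```

Because each factor is row-stochastic, the rollout matrix is too, so row i can still be read as a distribution of token i's cumulative sources.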

Attention Flow

Visualization technique showing how information flows through Transformer layers by tracing attentional influence paths between tokens.

Gradient-Based Attention

Approach using gradients of the output with respect to attention weights to identify the most relevant contributions to the model's prediction.

Token-to-Token Attention

Direct visualization of pairwise attention relationships between tokens, allowing identification of local and global dependencies in the input sequence.

Layer-wise Attention Analysis

Comparative examination of attention patterns across different network depths, revealing the evolution of abstract representations from lower to upper layers.
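
One simple layer-wise statistic is the mean entropy of the attention rows: high entropy means diffuse attention, low entropy means sharp focus. The sketch below fabricates per-layer attention from random logits that sharpen with depth, purely to illustrate the measurement.

```python
import numpy as np

def attention_entropy(A):
    """Mean Shannon entropy of the attention rows (in nats)."""
    return float(-(A * np.log(A + 1e-12)).sum(axis=1).mean())

rng = np.random.default_rng(6)
n = 6                                          # illustrative sequence length
layer_entropies = []
for depth in range(4):                         # 4 illustrative layers
    logits = rng.normal(size=(n, n)) * (depth + 1)   # sharper logits with depth
    A = np.exp(logits) / np.exp(logits).sum(axis=1, keepdims=True)
    layer_entropies.append(attention_entropy(A))
print(layer_entropies)  # one entropy value per layer, bounded by log(n)
```

Comparing such per-layer statistics (entropy, diagonal mass, distance to previous token, etc.) is a common way to track how representations change from lower to upper layers.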

Attention Trajectory

Temporal visualization of the evolution of attention weights during inference or training, showing how patterns stabilize or change dynamically.

Attention Saliency Maps

Heatmaps overlaid on the input text to highlight tokens receiving the most attention, facilitating interpretation of the model's decisions.
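
A text-only sketch of the underlying scoring: one simple saliency score per token is the total attention it receives (a column sum of the attention matrix), with the top-scoring tokens marked. In practice the scores are rendered as a color overlay rather than brackets.

```python
import numpy as np

def saliency_highlight(weights, tokens, top_k=2):
    """Mark the tokens that receive the most total attention (column sums)."""
    received = weights.sum(axis=0)                 # attention received per token
    top = set(np.argsort(received)[-top_k:])       # indices of top-k tokens
    return " ".join(f"[{t}]" if i in top else t for i, t in enumerate(tokens))

tokens = ["the", "cat", "sat", "down"]             # illustrative sentence
rng = np.random.default_rng(3)
W = rng.random((4, 4))
W /= W.sum(axis=1, keepdims=True)                  # row-normalized attention
print(saliency_highlight(W, tokens))
```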

Attention Propagation

Technique tracing how attention signals propagate and amplify through the network, revealing critical paths for decision-making.

Attention Projection

Dimensional reduction of high-dimensional attention weights to visualizable 2D/3D spaces, using techniques like t-SNE or UMAP to identify clusters of similar patterns.
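
The definition mentions t-SNE and UMAP; the sketch below uses plain PCA (via numpy's SVD) instead, since it needs no extra dependency and shows the same pipeline: flatten each head's attention matrix into a vector, then project the vectors to 2D. The head count and matrix sizes are illustrative.

```python
import numpy as np

def project_attention_2d(head_matrices):
    """Flatten each head's attention matrix and project to 2D via PCA (SVD)."""
    X = np.stack([A.ravel() for A in head_matrices])  # one row per head
    X = X - X.mean(axis=0)                            # center the data
    U, S, Vt = np.linalg.svd(X, full_matrices=False)
    return X @ Vt[:2].T                               # coords on top-2 components

rng = np.random.default_rng(4)
heads = []
for _ in range(8):                                    # 8 illustrative heads
    A = rng.random((6, 6))
    heads.append(A / A.sum(axis=1, keepdims=True))
coords = project_attention_2d(heads)
print(coords.shape)  # (8, 2): one 2D point per attention head
```

Swapping in t-SNE or UMAP only changes the projection step; the flatten-then-embed structure stays the same.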

Attention Clustering

Automatic grouping of similar attention patterns to identify recurring behaviors or functional specializations in attention mechanisms.
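
A toy version of the idea: flatten attention matrices and group them with a tiny two-cluster k-means. The initialization is a simplified deterministic farthest-point scheme (not standard random or k-means++ init), and the two synthetic pattern families (diagonal vs. uniform) are fabricated for illustration.

```python
import numpy as np

def cluster_attention(head_matrices, iters=10):
    """Group flattened attention matrices with a tiny 2-cluster k-means sketch."""
    X = np.stack([A.ravel() for A in head_matrices]).astype(float)
    far = np.linalg.norm(X - X[0], axis=1).argmax()
    centers = np.stack([X[0], X[far]])          # deterministic farthest-point init
    for _ in range(iters):
        dists = np.linalg.norm(X[:, None] - centers[None], axis=-1)
        labels = dists.argmin(axis=1)           # nearest-center assignment
        for j in range(len(centers)):
            if (labels == j).any():
                centers[j] = X[labels == j].mean(axis=0)
    return labels

# Two synthetic pattern families: diagonal ("positional") vs. uniform attention.
diag_heads = [np.eye(4) * 0.9 + 0.025 for _ in range(3)]
flat_heads = [np.full((4, 4), 0.25) for _ in range(3)]
labels = cluster_attention(diag_heads + flat_heads)
print(labels)  # each family lands in its own cluster
```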

Attention Pattern Classification

Automatic categorization of types of attention patterns (syntactic, semantic, positional) based on their structural and distributional characteristics.
