AI Glossary

The complete glossary of AI

162 categories · 2,032 subcategories · 23,060 terms

Attention Mechanism

Mathematical foundation allowing models to weight the relative importance of elements in a data sequence.

5 terms
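The idea can be sketched in a few lines of plain Python: scaled dot-product attention for a single query over a short sequence. This is a toy illustration with hand-picked vectors, not any particular library's implementation, and it omits the learned query/key/value projections real models use.

```python
import math

def softmax(xs):
    # Numerically stable softmax over a list of scores.
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def attention(query, keys, values):
    """Scaled dot-product attention for one query.

    query: vector of dimension d; keys/values: one vector per element.
    """
    d = len(query)
    # Similarity of the query to every key, scaled by sqrt(d).
    scores = [sum(q * k for q, k in zip(query, key)) / math.sqrt(d)
              for key in keys]
    weights = softmax(scores)  # relative importance of each element
    # Output is the importance-weighted average of the value vectors.
    return [sum(w * v[i] for w, v in zip(weights, values))
            for i in range(len(values[0]))]

# The query matches the first key most strongly, so the output
# leans toward the first value vector.
out = attention([1.0, 0.0],
                keys=[[1.0, 0.0], [0.0, 1.0]],
                values=[[10.0, 0.0], [0.0, 10.0]])
```

Because the weights sum to one, the output always stays a convex combination of the value vectors.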

Self-Attention

Mechanism where each element of a sequence computes its attention relative to all other elements in the same sequence.

0 terms
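A minimal sketch of the "same sequence on both sides" property: each vector serves as query, key, and value at once (a simplification — real layers apply separate learned projections first).

```python
import math

def softmax(xs):
    m = max(xs)
    e = [math.exp(x - m) for x in xs]
    s = sum(e)
    return [v / s for v in e]

def self_attention(seq):
    """Every element of `seq` attends to every element of the same
    sequence, producing one context vector per position."""
    d = len(seq[0])
    out = []
    for q in seq:
        scores = [sum(a * b for a, b in zip(q, k)) / math.sqrt(d)
                  for k in seq]
        w = softmax(scores)
        out.append([sum(wi * v[i] for wi, v in zip(w, seq))
                    for i in range(d)])
    return out

# Three toy vectors; each output mixes information from all three.
ctx = self_attention([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
```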

Multi-Head Attention

Attention extension using multiple attention heads in parallel to capture different types of relationships.

3 terms
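A pure-Python sketch of the splitting idea: each vector is sliced into per-head subspaces, attention runs independently per head, and the head outputs are concatenated. The learned per-head projections and final output projection of a real layer are omitted for brevity.

```python
import math

def softmax(xs):
    m = max(xs)
    e = [math.exp(x - m) for x in xs]
    s = sum(e)
    return [v / s for v in e]

def attend(qs, ks, vs):
    # Scaled dot-product attention over one head's subspace.
    d = len(qs[0])
    out = []
    for q in qs:
        scores = [sum(a * b for a, b in zip(q, k)) / math.sqrt(d)
                  for k in ks]
        w = softmax(scores)
        out.append([sum(wi * v[i] for wi, v in zip(w, vs))
                    for i in range(d)])
    return out

def multi_head(seq, n_heads):
    """Split each vector into n_heads slices, attend per slice,
    then concatenate the per-head outputs."""
    d = len(seq[0])
    hd = d // n_heads
    heads = []
    for h in range(n_heads):
        sub = [v[h * hd:(h + 1) * hd] for v in seq]
        heads.append(attend(sub, sub, sub))
    # Concatenate head outputs back to the model dimension.
    return [sum((heads[h][i] for h in range(n_heads)), [])
            for i in range(len(seq))]

out = multi_head([[1.0, 0.0, 0.0, 1.0], [0.0, 1.0, 1.0, 0.0]], n_heads=2)
```

Each head sees a different slice of the representation, which is what lets the heads specialize in different relationship types.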

Positional Encoding

Technique for incorporating the sequential position of elements into embeddings without using an RNN.

12 terms
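The best-known instance is the sinusoidal scheme from the original Transformer paper, sketched below: even dimensions use sine, odd dimensions cosine, with wavelengths forming a geometric progression so every position gets a unique, order-aware vector.

```python
import math

def positional_encoding(pos, d_model):
    """Sinusoidal positional encoding for a single position:
    PE(pos, 2i)   = sin(pos / 10000^(2i / d_model))
    PE(pos, 2i+1) = cos(pos / 10000^(2i / d_model))
    """
    pe = []
    for i in range(d_model):
        angle = pos / (10000 ** (2 * (i // 2) / d_model))
        pe.append(math.sin(angle) if i % 2 == 0 else math.cos(angle))
    return pe

# Position 0 encodes as alternating sin(0)/cos(0) values.
pe0 = positional_encoding(0, 4)  # → [0.0, 1.0, 0.0, 1.0]
```

The vector is simply added to the token embedding, so position information flows through attention without any recurrence.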

Encoder-Decoder Architecture

Fundamental structure of Transformers separating input processing (encoder) and output generation (decoder).

2 terms

Attention Scaling

Normalization of attention scores by the square root of the key dimension, which stabilizes training by keeping dot products from growing with dimension and saturating the softmax.

14 terms
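The effect can be shown numerically. With unit-variance random vectors, dot products grow roughly with the square root of the dimension, so the unscaled softmax collapses almost all weight onto one position; dividing by sqrt(d) keeps the distribution softer. The dimension (256), the 8 keys, and the seed are arbitrary toy choices.

```python
import math
import random

def softmax(xs):
    m = max(xs)
    e = [math.exp(x - m) for x in xs]
    s = sum(e)
    return [v / s for v in e]

random.seed(0)
d = 256
q = [random.gauss(0, 1) for _ in range(d)]
keys = [[random.gauss(0, 1) for _ in range(d)] for _ in range(8)]

# Unscaled scores have standard deviation ~sqrt(d) = 16 ...
raw = [sum(a * b for a, b in zip(q, k)) for k in keys]
# ... scaling restores them to roughly unit variance.
scaled = [s / math.sqrt(d) for s in raw]

peak_raw = max(softmax(raw))        # nearly all mass on one key
peak_scaled = max(softmax(scaled))  # a much softer distribution
```

Since `raw` is just `scaled` multiplied by 16, the unscaled softmax is provably peakier — sharpening the distribution until its gradients vanish, which is exactly what the scaling prevents.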

Cross-Attention

Attention mechanism between two different sequences, used in translation and multimodal tasks.

8 terms
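A sketch of the two-sequence structure: queries come from one sequence (e.g. the decoder), while keys and values come from another (e.g. the encoder output). Vectors are toy values and learned projections are omitted.

```python
import math

def softmax(xs):
    m = max(xs)
    e = [math.exp(x - m) for x in xs]
    s = sum(e)
    return [v / s for v in e]

def cross_attention(queries, context):
    """Each query attends over `context`, a *different* sequence
    that supplies both the keys and the values."""
    d = len(queries[0])
    out = []
    for q in queries:
        scores = [sum(a * b for a, b in zip(q, k)) / math.sqrt(d)
                  for k in context]
        w = softmax(scores)
        out.append([sum(wi * v[i] for wi, v in zip(w, context))
                    for i in range(d)])
    return out

# Two decoder queries attend over a three-element encoder sequence.
out = cross_attention([[1.0, 0.0], [0.0, 1.0]],
                      context=[[2.0, 0.0], [0.0, 2.0], [1.0, 1.0]])
```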

Sparse Attention

Variant of attention computed only on a subset of positions to reduce computational complexity.

3 terms
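One common sparse pattern is a sliding window (used, for example, in Longformer-style models): each position attends only to its neighbors, sketched here in pure Python with toy vectors.

```python
import math

def softmax(xs):
    m = max(xs)
    e = [math.exp(x - m) for x in xs]
    s = sum(e)
    return [v / s for v in e]

def sliding_window_attention(seq, window=1):
    """Each position attends only to positions within `window` steps,
    cutting the cost from O(n^2) to O(n * window)."""
    d = len(seq[0])
    out = []
    for i, q in enumerate(seq):
        lo, hi = max(0, i - window), min(len(seq), i + window + 1)
        block = seq[lo:hi]  # the only keys/values this position sees
        scores = [sum(a * b for a, b in zip(q, k)) / math.sqrt(d)
                  for k in block]
        w = softmax(scores)
        out.append([sum(wi * v[j] for wi, v in zip(w, block))
                    for j in range(d)])
    return out

out = sliding_window_attention([[1.0, 0.0], [0.0, 1.0],
                                [1.0, 1.0], [0.0, 0.0]], window=1)
```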

Attention Masks

Control mechanisms that mask certain positions during attention computation to prevent information leakage.

9 terms
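The most common example is the causal mask used for autoregressive decoding, sketched below: scores for future positions are set to negative infinity before the softmax, so their weights become exactly zero.

```python
import math

def softmax(xs):
    m = max(xs)
    e = [math.exp(x - m) for x in xs]  # exp(-inf) evaluates to 0.0
    s = sum(e)
    return [v / s for v in e]

def causal_self_attention(seq):
    """Masked self-attention: position i may only attend to
    positions <= i, so future tokens cannot leak backwards."""
    d = len(seq[0])
    out = []
    for i, q in enumerate(seq):
        scores = []
        for j, k in enumerate(seq):
            if j > i:
                scores.append(float("-inf"))  # masked position
            else:
                scores.append(sum(a * b for a, b in zip(q, k))
                              / math.sqrt(d))
        w = softmax(scores)
        out.append([sum(wi * v[t] for wi, v in zip(w, seq))
                    for t in range(d)])
    return out

out = causal_self_attention([[1.0, 0.0], [0.0, 1.0]])
# The first position can only see itself, so its output is its input.
```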

Vision Transformers

Adaptation of the Transformer architecture to computer vision tasks by treating images as sequences of patches.

9 terms
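The "sequence of patches" step can be sketched directly: split an image grid into non-overlapping blocks and flatten each block into a vector. Real ViTs then linearly project each patch and add positional encodings; the 4×4 single-channel image below is a toy stand-in.

```python
def patchify(image, patch):
    """Split an H x W image (list of pixel rows) into non-overlapping
    patch x patch blocks, each flattened into one vector — the token
    sequence a Vision Transformer feeds to its attention layers."""
    h, w = len(image), len(image[0])
    seq = []
    for r in range(0, h, patch):
        for c in range(0, w, patch):
            seq.append([image[r + i][c + j]
                        for i in range(patch)
                        for j in range(patch)])
    return seq

img = [[1, 2, 3, 4],
       [5, 6, 7, 8],
       [9, 10, 11, 12],
       [13, 14, 15, 16]]
patches = patchify(img, 2)  # 4 patches of 4 pixels each
```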

Efficient Attention

Set of optimizations aimed at reducing the quadratic complexity of standard attention for longer sequences.

2 terms

Hierarchical Attention

Multi-level attention structure capturing relationships at different hierarchical scales in the data.

12 terms