
AI Glossary

The complete glossary of Artificial Intelligence

162 categories · 2,032 subcategories · 23,060 terms

Attention Mechanism

Mathematical foundation allowing models to weight the relative importance of elements in a data sequence.

5 terms
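
Below is a minimal NumPy sketch of scaled dot-product attention, the formula underlying this family; the shapes and helper names are illustrative assumptions, not taken from the glossary entries.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention(Q, K, V):
    # Each query scores every key; softmax turns the scores into
    # weights that mix the values.
    scores = Q @ K.T / np.sqrt(K.shape[-1])
    return softmax(scores) @ V

Q = np.random.randn(4, 8)   # 4 query positions, model dim 8
K = np.random.randn(6, 8)   # 6 key/value positions
V = np.random.randn(6, 8)
out = attention(Q, K, V)    # -> shape (4, 8)
```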

Self-Attention

Mechanism where each element of a sequence computes its attention relative to all other elements in the same sequence.

0 terms
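
A small sketch of the idea, with random projection matrices assumed purely for illustration: Q, K, and V are all derived from the same sequence, which is what makes the attention "self".

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

rng = np.random.default_rng(0)
X = rng.normal(size=(5, 8))          # one sequence: 5 tokens, dim 8
Wq, Wk, Wv = (rng.normal(size=(8, 8)) for _ in range(3))

# Q, K and V all come from the same sequence X.
Q, K, V = X @ Wq, X @ Wk, X @ Wv
weights = softmax(Q @ K.T / np.sqrt(8.0))  # (5, 5): every token vs. every token
out = weights @ V
```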

Multi-Head Attention

Extension of attention that runs multiple attention heads in parallel so each can capture a different type of relationship.

3 terms
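
A compact sketch of the head-splitting idea. Real implementations learn separate per-head projections; identity projections are assumed here for brevity.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def multi_head_attention(X, n_heads=4):
    T, d = X.shape
    dh = d // n_heads
    # Reshape to (heads, tokens, head_dim) so each head attends independently.
    def split(M):
        return M.reshape(T, n_heads, dh).transpose(1, 0, 2)
    Q = K = V = split(X)                         # identity projections for brevity
    scores = Q @ K.transpose(0, 2, 1) / np.sqrt(dh)
    out = softmax(scores) @ V                    # (heads, T, dh)
    return out.transpose(1, 0, 2).reshape(T, d)  # concatenate heads

X = np.random.randn(6, 16)
print(multi_head_attention(X).shape)  # (6, 16)
```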

Positional Encoding

Technique for incorporating the sequential position of elements into embeddings without using an RNN.

12 terms
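
One common choice is the sinusoidal scheme from the original Transformer paper; a minimal sketch:

```python
import numpy as np

def sinusoidal_positions(seq_len, d_model):
    # Each position gets a unique pattern of sines and cosines at
    # varying frequencies; added to embeddings, it injects order
    # without any recurrence.
    pos = np.arange(seq_len)[:, None]
    i = np.arange(d_model // 2)[None, :]
    angles = pos / (10000 ** (2 * i / d_model))
    pe = np.zeros((seq_len, d_model))
    pe[:, 0::2] = np.sin(angles)
    pe[:, 1::2] = np.cos(angles)
    return pe

embeddings = np.random.randn(10, 64)
embeddings = embeddings + sinusoidal_positions(10, 64)
```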

Encoder-Decoder Architecture

Fundamental structure of Transformers separating input processing (encoder) and output generation (decoder).

2 terms
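
A deliberately simplified caricature of the data flow, using bare attention in place of full Transformer blocks (no feed-forward layers, normalization, or masking):

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attend(Q, K, V):
    return softmax(Q @ K.T / np.sqrt(K.shape[-1])) @ V

src = np.random.randn(7, 8)   # source sequence (e.g. input sentence)
tgt = np.random.randn(3, 8)   # target tokens generated so far

# Encoder: self-attention over the input produces a "memory".
memory = attend(src, src, src)

# Decoder: self-attention over the target, then cross-attention
# that reads from the encoder's memory.
h = attend(tgt, tgt, tgt)
out = attend(h, memory, memory)   # (3, 8)
```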

Attention Scaling

Normalization of attention scores by the square root of the key dimensionality, which stabilizes training by keeping dot products from growing large and saturating the softmax.

14 terms
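
A quick numerical illustration of why the scaling is needed; the sample size and dimensionality below are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 512
q = rng.normal(size=(10_000, d))
k = rng.normal(size=(10_000, d))

raw = (q * k).sum(axis=1)        # unscaled dot products
scaled = raw / np.sqrt(d)

# With unit-variance inputs, a d-dim dot product has variance ~d;
# dividing by sqrt(d) restores variance ~1, keeping the softmax in
# a well-behaved range.
print(raw.std(), scaled.std())   # ~22.6 vs ~1.0
```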

Cross-Attention

Attention mechanism between two different sequences, used in translation and multimodal tasks.

8 terms
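
A sketch of the defining difference from self-attention: queries come from one sequence (here a hypothetical decoder state) while keys and values come from another (encoder output).

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

decoder_state = np.random.randn(3, 8)   # e.g. target-language tokens
encoder_out   = np.random.randn(7, 8)   # e.g. source-language tokens

# Queries from one sequence, keys/values from the other.
Q, K, V = decoder_state, encoder_out, encoder_out
out = softmax(Q @ K.T / np.sqrt(8.0)) @ V   # (3, 8)
```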

Sparse Attention

Variant of attention computed only on a subset of positions to reduce computational complexity.

3 terms
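
One simple instance is local windowed attention; a sketch with an assumed window radius:

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

T, w = 8, 2                      # sequence length, local window radius
scores = np.random.randn(T, T)

# Keep only positions within +/-w of each query; mask out the rest.
idx = np.arange(T)
local = np.abs(idx[:, None] - idx[None, :]) <= w
scores = np.where(local, scores, -np.inf)
weights = softmax(scores)        # each row attends to at most 2w+1 positions
```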

Attention Masks

Control mechanisms that hide certain positions during attention computation to prevent information leakage.

9 terms
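
A sketch of the standard causal mask, one common use of attention masking:

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

T = 5
scores = np.random.randn(T, T)

# Causal mask: position i may not look at positions j > i, which
# prevents information leaking from future tokens.
causal = np.tril(np.ones((T, T), dtype=bool))
weights = softmax(np.where(causal, scores, -np.inf))
print(np.triu(weights, k=1).max())   # 0.0: no weight on the future
```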

Vision Transformers

Adaptation of the Transformer architecture to computer vision tasks by treating images as sequences of patches.

9 terms
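
A sketch of the patchify step that turns an image into a token sequence, assuming ViT-style 16x16 patches:

```python
import numpy as np

def patchify(img, p=16):
    # Split an image into non-overlapping p x p patches and flatten
    # each into a vector, turning the image into a token sequence.
    H, W, C = img.shape
    patches = img.reshape(H // p, p, W // p, p, C)
    patches = patches.transpose(0, 2, 1, 3, 4)
    return patches.reshape(-1, p * p * C)

img = np.random.rand(224, 224, 3)
tokens = patchify(img)            # (196, 768): 14x14 patches of dim 16*16*3
```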

Efficient Attention

Set of optimizations aimed at reducing the quadratic complexity of standard attention for longer sequences.

2 terms
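
Linear attention is one member of this family; a rough sketch with an assumed ReLU feature map, in the spirit of kernel-based methods rather than any specific library's API:

```python
import numpy as np

def linear_attention(Q, K, V, eps=1e-6):
    # Replace the softmax with a positive feature map so the (K, V)
    # product can be computed first, giving O(T * d^2) cost instead
    # of the O(T^2 * d) of standard attention.
    phi = lambda x: np.maximum(x, 0) + eps        # simple positive feature map
    Qf, Kf = phi(Q), phi(K)
    KV = Kf.T @ V                                 # (d, d) summary, independent of T
    Z = Qf @ Kf.sum(axis=0, keepdims=True).T      # per-query normalizer, (T, 1)
    return (Qf @ KV) / Z

Q = K = V = np.random.randn(1000, 32)
out = linear_attention(Q, K, V)   # never materializes a 1000x1000 matrix
```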

Hierarchical Attention

Multi-level attention structure capturing relationships at different hierarchical scales in the data.

12 terms
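
One instantiation is attention pooling applied first over words, then over sentences, in the spirit of Hierarchical Attention Networks; the shapes and context vectors below are assumptions for illustration.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def pool_attend(X, ctx):
    # Attention pooling: a context vector scores each row of X,
    # and the rows are averaged with those weights.
    w = softmax(X @ ctx)
    return w @ X

rng = np.random.default_rng(0)
doc = rng.normal(size=(3, 6, 8))        # 3 sentences x 6 words x dim 8
ctx_word, ctx_sent = rng.normal(size=(2, 8))

# Level 1: attend over words to get one vector per sentence.
sent_vecs = np.stack([pool_attend(s, ctx_word) for s in doc])  # (3, 8)
# Level 2: attend over sentence vectors to get a document vector.
doc_vec = pool_attend(sent_vecs, ctx_sent)                     # (8,)
```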