AI Glossary

A complete glossary of artificial intelligence

162 categories · 2,032 subcategories · 23,060 terms

Language Model

Statistical or neural system that calculates the probability of word sequences appearing in a language. These models learn contextual and syntactic dependencies from large text corpora to generate or evaluate natural language.
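
A minimal Python sketch of how such a model scores a sequence via the chain rule, P(w1..wn) = ∏ P(wi | w1..wi-1); the bigram probability table below is an illustrative assumption, not values learned from any corpus.

import math

# Toy conditional probabilities P(word | previous word) -- illustrative
# assumptions standing in for a trained model.
bigram_prob = {
    ("<s>", "the"): 0.4,
    ("the", "cat"): 0.1,
    ("cat", "sat"): 0.3,
}

def sequence_log_prob(words):
    # Chain rule under a bigram assumption: sum of log P(wi | w(i-1)).
    logp = 0.0
    prev = "<s>"
    for w in words:
        logp += math.log(bigram_prob.get((prev, w), 1e-8))  # tiny floor for unseen pairs
        prev = w
    return logp

print(sequence_log_prob(["the", "cat", "sat"]))  # log(0.4 * 0.1 * 0.3)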

Transformer Architecture

Neural architecture based on attention mechanisms that processes entire sequences in parallel, without recurrence. Transformers have revolutionized language modeling thanks to their ability to capture long-range dependencies.
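
At the core of the architecture is scaled dot-product attention. A minimal NumPy sketch, assuming random toy inputs of 4 tokens with 8-dimensional projections:

import numpy as np

def scaled_dot_product_attention(Q, K, V):
    # Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                  # query-key similarities
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)   # softmax over key positions
    return weights @ V                               # weighted sum of values

rng = np.random.default_rng(0)
Q = rng.normal(size=(4, 8))  # 4 tokens, 8-dim queries (illustrative shapes)
K = rng.normal(size=(4, 8))
V = rng.normal(size=(4, 8))
print(scaled_dot_product_attention(Q, K, V).shape)  # (4, 8)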

GPT

Family of generative language models built on a decoder-only Transformer architecture that processes text from left to right. GPT models specialize in generating coherent text and completing sequences.
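
What makes decoding strictly left-to-right is a causal attention mask. A minimal NumPy sketch, with random scores standing in for real query-key products:

import numpy as np

n = 5  # sequence length (illustrative)
# Lower-triangular mask: position i may attend only to positions <= i.
mask = np.tril(np.ones((n, n), dtype=bool))

scores = np.random.default_rng(0).normal(size=(n, n))
scores = np.where(mask, scores, -np.inf)  # future positions get -inf before softmax
weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
weights /= weights.sum(axis=-1, keepdims=True)
print(np.round(weights, 2))  # each row sums to 1 over past positions only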

N-grams

Statistical language models based on the Markov assumption that the probability of a word depends only on the preceding n-1 words. N-gram models were the classic approach to language modeling before the neural network era.
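
A minimal sketch of a bigram (n = 2) model estimated by maximum likelihood; the nine-word corpus is an illustrative assumption:

from collections import Counter

corpus = "the cat sat on the mat the cat slept".split()

# Count bigram pairs and the contexts they condition on.
bigrams = Counter(zip(corpus, corpus[1:]))
contexts = Counter(corpus[:-1])

def p(word, prev):
    # Maximum-likelihood estimate: P(word | prev) = count(prev, word) / count(prev)
    return bigrams[(prev, word)] / contexts[prev]

print(p("cat", "the"))  # 2/3: "the" is followed by "cat" twice and "mat" once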

RNN

Recurrent neural network that processes sequences by maintaining a hidden state updated at each token. RNNs were among the first neural architectures applied to language modeling to capture temporal dependencies.
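
A minimal NumPy sketch of one recurrent step, with random weights standing in for a trained model:

import numpy as np

rng = np.random.default_rng(0)
d_in, d_h = 8, 16  # illustrative dimensions
W_xh = rng.normal(scale=0.1, size=(d_h, d_in))  # input-to-hidden weights
W_hh = rng.normal(scale=0.1, size=(d_h, d_h))   # hidden-to-hidden (recurrent) weights
b_h = np.zeros(d_h)

def rnn_step(x, h):
    # h_t = tanh(W_xh x_t + W_hh h_(t-1) + b): the hidden state carries context forward.
    return np.tanh(W_xh @ x + W_hh @ h + b_h)

h = np.zeros(d_h)
for x in rng.normal(size=(5, d_in)):  # process a sequence of 5 token embeddings
    h = rnn_step(x, h)
print(h.shape)  # (16,)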

LSTM

Advanced variant of the RNN that uses gating mechanisms (input, forget, and output gates) and a dedicated memory cell to manage long-term dependencies. LSTMs mitigated the vanishing gradient problem that limited traditional RNNs in language modeling applications.
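
A minimal NumPy sketch of one LSTM cell; biases are omitted for brevity and the weights are random stand-ins for trained parameters:

import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(0)
d_in, d_h = 8, 16  # illustrative dimensions
# One weight matrix per gate, acting on the concatenated [h, x] vector.
W_f, W_i, W_o, W_c = (rng.normal(scale=0.1, size=(d_h, d_h + d_in)) for _ in range(4))

def lstm_step(x, h, c):
    z = np.concatenate([h, x])
    f = sigmoid(W_f @ z)              # forget gate: what to erase from the cell
    i = sigmoid(W_i @ z)              # input gate: what new information to write
    o = sigmoid(W_o @ z)              # output gate: what to expose as the hidden state
    c = f * c + i * np.tanh(W_c @ z)  # cell state: the long-term memory path
    return o * np.tanh(c), c

h, c = np.zeros(d_h), np.zeros(d_h)
for x in rng.normal(size=(5, d_in)):
    h, c = lstm_step(x, h, c)
print(h.shape, c.shape)  # (16,) (16,)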

Causal Language Models

Models trained to predict the next word based only on previous words in the sequence. Causal models are particularly suited for text generation and completion tasks.
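
A minimal sketch of greedy left-to-right generation; the toy conditional distributions are illustrative assumptions standing in for a trained model's predictions:

probs = {
    "<s>": {"the": 0.9, "a": 0.1},
    "the": {"cat": 0.6, "dog": 0.4},
    "cat": {"sat": 0.7, "<eos>": 0.3},
    "dog": {"sat": 0.5, "<eos>": 0.5},
    "sat": {"<eos>": 1.0},
}

def greedy_generate(max_len=10):
    # Each token is chosen using only the context to its left.
    tokens, cur = [], "<s>"
    for _ in range(max_len):
        cur = max(probs[cur], key=probs[cur].get)  # pick the most likely next token
        if cur == "<eos>":
            break
        tokens.append(cur)
    return " ".join(tokens)

print(greedy_generate())  # "the cat sat"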

Masked Language Models

Models trained to predict masked words in a sequence using the full bidirectional context. This approach yields a richer understanding of context, making it well suited to analysis and classification tasks.
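
A minimal sketch of preparing one masked-language-modeling training example; the sentence and the masked positions are illustrative (BERT-style training masks roughly 15% of tokens at random):

tokens = "the cat sat on the mat".split()
mask_positions = [2, 4]  # positions chosen for masking (illustrative)

# The model sees context on both sides of each [MASK] and must recover the original.
masked = ["[MASK]" if i in mask_positions else t for i, t in enumerate(tokens)]
targets = {i: tokens[i] for i in mask_positions}

print(" ".join(masked))  # the cat [MASK] on [MASK] mat
print(targets)           # {2: 'sat', 4: 'the'}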
