
AI Glossary

The complete Artificial Intelligence dictionary

162 categories · 2,032 subcategories · 23,060 terms

Language Model

A statistical or neural system that assigns probabilities to sequences of words in a language. These models learn contextual and syntactic dependencies from large text corpora in order to generate or evaluate natural language.
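By the chain rule, a language model factors the probability of a sentence into per-word conditional probabilities. A minimal sketch with a made-up toy distribution (all probabilities below are illustrative, not estimated from any corpus):

```python
# Chain rule of language modeling: P(w1..wn) = Π P(w_i | w1..w_{i-1}).
# Toy distribution over two-word "sentences" (probabilities are invented).
p_first = {"the": 0.6, "a": 0.4}
p_next = {("the", "cat"): 0.5, ("the", "dog"): 0.5,
          ("a", "cat"): 0.3, ("a", "dog"): 0.7}

def sentence_prob(w1, w2):
    """P(w1, w2) = P(w1) * P(w2 | w1)."""
    return p_first[w1] * p_next[(w1, w2)]

print(sentence_prob("the", "cat"))  # 0.6 * 0.5 = 0.3
```

A real model estimates these conditionals from data (or with a neural network) rather than hard-coding them.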


Transformer Architecture

Neural architecture based on attention mechanisms that processes all positions of a sequence in parallel, without the step-by-step recurrence of earlier models. Transformers have revolutionized language modeling thanks to their ability to capture long-range dependencies.
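The attention mechanism at the heart of the Transformer can be sketched in a few lines of NumPy. This is an illustrative toy (random inputs, no learned weights or multiple heads), not a full implementation:

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Every query attends to every key at once -- this is what lets the
    Transformer process the whole sequence in parallel."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                # pairwise similarities, scaled
    scores -= scores.max(axis=-1, keepdims=True)   # numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True) # softmax over each row
    return weights @ V                             # weighted mix of value vectors

# Toy sequence: 3 tokens with 4-dimensional embeddings (random, illustrative)
rng = np.random.default_rng(0)
X = rng.normal(size=(3, 4))
out = scaled_dot_product_attention(X, X, X)        # self-attention: Q = K = V = X
print(out.shape)  # (3, 4)
```

Each output row is a context-dependent blend of all value vectors, which is how long-range dependencies are captured in a single step.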


GPT

Family of generative language models based on a decoder-only Transformer architecture that processes text from left to right. GPT models specialize in generating coherent text and in sequence completion.


N-grams

Statistical language models based on the Markov assumption that the probability of a word depends only on the previous n-1 words. N-grams were the classic approach to language modeling before the neural-network era.
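A bigram model (n = 2) illustrates the idea; the corpus below is a made-up toy and the probabilities are maximum-likelihood counts:

```python
from collections import Counter

corpus = "the cat sat on the mat the cat ran".split()

# Count bigrams and the contexts they condition on
bigrams = Counter(zip(corpus, corpus[1:]))
contexts = Counter(corpus[:-1])

def bigram_prob(w, prev):
    """P(w | prev) estimated by maximum likelihood from the toy corpus."""
    return bigrams[(prev, w)] / contexts[prev]

print(bigram_prob("cat", "the"))  # "the" is followed by "cat" 2 times out of 3
```

Real n-gram systems add smoothing (e.g. Kneser-Ney) so that unseen word pairs do not get zero probability.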


RNN

Recurrent neural network that processes sequences by maintaining a hidden state updated at each token. RNNs were among the first neural architectures applied to language modeling to capture temporal dependencies.
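The recurrence can be sketched as a single update function applied token by token. Weights here are random and purely illustrative:

```python
import numpy as np

def rnn_step(x_t, h_prev, W_xh, W_hh, b_h):
    """One recurrence step: the new hidden state mixes the current input
    with the previous state, so information flows left to right."""
    return np.tanh(x_t @ W_xh + h_prev @ W_hh + b_h)

rng = np.random.default_rng(0)
d_in, d_h = 4, 8
W_xh = rng.normal(scale=0.1, size=(d_in, d_h))  # input-to-hidden weights
W_hh = rng.normal(scale=0.1, size=(d_h, d_h))   # hidden-to-hidden (recurrent) weights
b_h = np.zeros(d_h)

h = np.zeros(d_h)                       # initial hidden state
for x_t in rng.normal(size=(5, d_in)):  # a sequence of 5 token embeddings
    h = rnn_step(x_t, h, W_xh, W_hh, b_h)
print(h.shape)  # (8,)
```

Because each state depends on the previous one, the loop cannot be parallelized across time steps, unlike the Transformer's attention.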


LSTM

Advanced variant of the RNN that uses gating mechanisms (input, forget, and output gates) and an explicit memory cell to manage long-term dependencies. LSTMs mitigated the vanishing-gradient problem that limited traditional RNNs in language modeling applications.
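A toy LSTM step shows how the gates regulate the memory cell; weights are random and illustrative, not trained:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_step(x_t, h_prev, c_prev, W, b):
    """One LSTM step: input (i), forget (f), and output (o) gates regulate
    what enters, stays in, and leaves the memory cell c."""
    z = np.concatenate([x_t, h_prev]) @ W + b   # compute all gates in one matmul
    i, f, o, g = np.split(z, 4)
    i, f, o, g = sigmoid(i), sigmoid(f), sigmoid(o), np.tanh(g)
    c = f * c_prev + i * g        # additive memory update keeps gradients alive
    h = o * np.tanh(c)            # exposed hidden state
    return h, c

rng = np.random.default_rng(0)
d_in, d_h = 4, 6
W = rng.normal(scale=0.1, size=(d_in + d_h, 4 * d_h))
b = np.zeros(4 * d_h)
h, c = np.zeros(d_h), np.zeros(d_h)
for x_t in rng.normal(size=(5, d_in)):
    h, c = lstm_step(x_t, h, c, W, b)
print(h.shape, c.shape)  # (6,) (6,)
```

The additive update of `c` (rather than repeated matrix multiplication) is what lets gradients survive across long sequences.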


Causal Language Models

Models trained to predict the next word based only on previous words in the sequence. Causal models are particularly suited for text generation and completion tasks.
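In Transformer-based causal models, the left-to-right restriction is typically enforced with a lower-triangular attention mask; a minimal illustration:

```python
import numpy as np

# Causal (lower-triangular) mask: position i may only attend to positions <= i,
# which is what restricts a causal language model to left-to-right context.
seq_len = 5
mask = np.tril(np.ones((seq_len, seq_len), dtype=bool))
print(mask.astype(int))
```

During training, attention scores at masked positions are set to minus infinity before the softmax, so each token's prediction never sees the future.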


Masked Language Models

Models trained to predict masked words in a sequence using the full bidirectional context. This makes them well suited to analysis and classification tasks that benefit from understanding the whole sentence.
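A sketch of the masking step (BERT masks roughly 15% of tokens; the toy below uses a 30% rate so that this short made-up sentence actually gets a mask):

```python
import random

random.seed(0)
tokens = "the quick brown fox jumps over the lazy dog".split()

# Hide some tokens and remember the originals; the model must recover them
# from BOTH left and right context.
masked, targets = [], {}
for i, tok in enumerate(tokens):
    if random.random() < 0.30:    # toy masking rate (BERT uses ~15%)
        targets[i] = tok          # keep the original word for the loss
        masked.append("[MASK]")
    else:
        masked.append(tok)
print(" ".join(masked), targets)
```

The training loss is computed only at the masked positions, comparing the model's prediction with the stored originals.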
