
AI Glossary

The Complete Artificial Intelligence Dictionary

162 categories · 2,032 subcategories · 23,060 terms

Language Model

Statistical or neural system that calculates the probability of word sequences appearing in a language. These models learn contextual and syntactic dependencies from large text corpora to generate or evaluate natural language.
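
As a minimal sketch of this idea, a first-order model can score a sequence with the chain rule, multiplying one conditional probability per word. The probability table below is hand-made for illustration, not taken from any real corpus:

```python
import math

# Toy conditional probabilities P(word | previous word) -- illustrative only.
cond_prob = {
    ("<s>", "the"): 0.5,
    ("the", "cat"): 0.2,
    ("cat", "sat"): 0.3,
}

def sequence_log_prob(words):
    """Chain-rule score under a first-order model:
    log P(w1..wn) = sum_i log P(w_i | w_{i-1})."""
    total = 0.0
    prev = "<s>"  # start-of-sentence marker
    for w in words:
        total += math.log(cond_prob[(prev, w)])
        prev = w
    return total

print(round(sequence_log_prob(["the", "cat", "sat"]), 4))
```

Working in log space avoids numerical underflow when many small probabilities are multiplied.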


Transformer Architecture

Neural architecture based on attention mechanisms that processes sequences in parallel without temporal dependencies. Transformers have revolutionized language models thanks to their ability to capture long-distance dependencies.
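
The core attention mechanism can be sketched in a few lines of plain Python. This is a simplified single-head, scaled dot-product attention without the learned query/key/value projections a real Transformer would use:

```python
import math

def softmax(xs):
    m = max(xs)  # subtract the max for numerical stability
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def attention(Q, K, V):
    """Scaled dot-product attention: every query position attends to every
    key position at once -- the parallel, long-range access the definition
    describes, with no recurrence over time steps."""
    d = len(K[0])
    out = []
    for q in Q:
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d) for k in K]
        weights = softmax(scores)
        out.append([sum(w * v[j] for w, v in zip(weights, V))
                    for j in range(len(V[0]))])
    return out
```

Each output row is a convex combination of the value vectors, weighted by query–key similarity.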


GPT

Family of generative language models based on a decoder-only Transformer architecture that processes text strictly left to right. GPT models specialize in generating coherent text and completing sequences.


N-grams

Statistical language models based on the Markov assumption that the probability of a word depends only on the previous n-1 words. N-grams were the classic approach to language modeling before the neural-network era.
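
A toy bigram model (n = 2) makes the Markov assumption concrete: conditional probabilities are estimated by maximum likelihood from raw counts, conditioning on only the single previous word:

```python
from collections import Counter

corpus = "the cat sat on the mat the cat ran".split()

# Count unigrams and adjacent pairs, then estimate
# P(w | prev) = count(prev, w) / count(prev).
unigrams = Counter(corpus)
bigrams = Counter(zip(corpus, corpus[1:]))

def bigram_prob(prev, w):
    return bigrams[(prev, w)] / unigrams[prev]

print(bigram_prob("the", "cat"))  # "the" occurs 3 times, "the cat" twice
```

Real n-gram systems add smoothing (e.g. Kneser-Ney) so unseen pairs do not get zero probability.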


RNN

Recurrent neural network that processes sequences by maintaining a hidden state that evolves at each token. RNNs were among the first neural architectures applied to language modeling to capture temporal dependencies.
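
The recurrence can be sketched as a single vanilla-RNN update applied token by token; the weight values below are arbitrary and purely illustrative:

```python
import math

def rnn_step(x, h, W_xh, W_hh, b):
    """One vanilla-RNN update: h' = tanh(W_xh x + W_hh h + b).
    The hidden state h carries context forward from earlier tokens."""
    n = len(h)
    return [math.tanh(
        sum(W_xh[i][j] * x[j] for j in range(len(x))) +
        sum(W_hh[i][j] * h[j] for j in range(n)) +
        b[i]) for i in range(n)]

# Process a two-token sequence, threading the hidden state through.
h = [0.0, 0.0]
W_xh = [[0.5, 0.0], [0.0, 0.5]]
W_hh = [[0.1, 0.0], [0.0, 0.1]]
b = [0.0, 0.0]
for x in [[1.0, 0.0], [0.0, 1.0]]:
    h = rnn_step(x, h, W_xh, W_hh, b)
```

Because the same weights are reused at every step, sequences of any length can be processed, but gradients must flow back through every step, which is what makes long-range dependencies hard.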


LSTM

Advanced RNN variant that uses input, forget, and output gates to control a dedicated memory cell and manage long-term dependencies. LSTMs mitigated the vanishing-gradient problem that limited traditional RNNs in language modeling applications.
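
The gating logic can be sketched with a scalar LSTM cell. Real implementations are vectorized; the parameter dictionary and its key names here are illustrative:

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def lstm_step(x, h, c, p):
    """One scalar LSTM update. p maps gate weights (wf, uf, bf, ...) to floats.
    The forget gate f decides how much of the memory cell c survives, the
    input gate i gates new candidate content, and the output gate o exposes
    part of the memory as the new hidden state."""
    f = sigmoid(p["wf"] * x + p["uf"] * h + p["bf"])    # forget gate
    i = sigmoid(p["wi"] * x + p["ui"] * h + p["bi"])    # input gate
    o = sigmoid(p["wo"] * x + p["uo"] * h + p["bo"])    # output gate
    g = math.tanh(p["wg"] * x + p["ug"] * h + p["bg"])  # candidate memory
    c_new = f * c + i * g          # additive update: gradients flow through c
    h_new = o * math.tanh(c_new)   # largely unattenuated across many steps
    return h_new, c_new
```

The additive form of the cell update (`f * c + i * g`) is the key trick: unlike the repeated matrix multiplications of a vanilla RNN, it gives gradients a near-linear path backward through time.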


Causal Language Models

Models trained to predict the next word based only on previous words in the sequence. Causal models are particularly suited for text generation and completion tasks.
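
A toy greedy completion loop illustrates the causal constraint: each prediction conditions only on tokens already generated, never on anything to the right. The probability table is made up for illustration:

```python
# Hypothetical next-word distributions, keyed by the last generated word.
next_word = {
    "<s>": {"the": 0.9, "a": 0.1},
    "the": {"cat": 0.6, "dog": 0.4},
    "cat": {"sat": 0.7, "ran": 0.3},
}

def greedy_complete(prefix, steps):
    """Autoregressive completion: at every step the model sees only the
    left context, then appends its most probable next word."""
    out = list(prefix)
    for _ in range(steps):
        dist = next_word.get(out[-1])
        if dist is None:  # no continuation known: stop generating
            break
        out.append(max(dist, key=dist.get))
    return out

print(greedy_complete(["<s>"], 3))  # -> ['<s>', 'the', 'cat', 'sat']
```

Real causal models condition on the entire prefix rather than just the last word, but the left-to-right generation loop is the same.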


Masked Language Models

Models trained to predict masked words in a sequence using the full bidirectional context. This approach allows for better context understanding for analysis and classification tasks.
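
A toy fill-in-the-blank sketch shows the bidirectional idea: candidates for the masked slot are scored against both the left and the right neighbor, something a causal model cannot do. Bigram counts from a tiny made-up corpus stand in for a neural model's logits:

```python
from collections import Counter

corpus = "the cat sat on the mat the cat ran the dog slept".split()
bigrams = Counter(zip(corpus, corpus[1:]))

def fill_mask(left, right, vocab):
    """Pick the candidate that best fits BOTH neighbors: unlike a causal
    model, the right-hand context also contributes to the score."""
    return max(vocab, key=lambda w: bigrams[(left, w)] + bigrams[(w, right)])

# "the [MASK] sat" -> score each candidate using both neighbors
print(fill_mask("the", "sat", ["cat", "dog", "mat"]))  # -> cat
```

Here "cat" wins because it is supported by both "the cat" and "cat sat", while "mat" fits only the left side; masked models like BERT exploit exactly this kind of two-sided evidence.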
