GPT Architecture
Decoder-Only Architecture
A Transformer variant that drops the encoder stack and uses only the decoder, optimizing the model for text generation. Causal (masked) self-attention ensures each token can attend only to itself and earlier tokens, preventing future information leakage during training.
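The causal masking described above can be sketched in a few lines. This is an illustrative NumPy example (not GPT's actual implementation): raw attention scores for "future" positions are set to negative infinity before the softmax, so they receive zero probability.

```python
import numpy as np

def causal_attention_weights(scores: np.ndarray) -> np.ndarray:
    """Apply a causal (look-ahead) mask to raw attention scores, then softmax,
    so each position attends only to itself and earlier positions."""
    seq_len = scores.shape[-1]
    # Boolean mask: True above the diagonal marks future positions.
    future = np.triu(np.ones((seq_len, seq_len), dtype=bool), k=1)
    masked = np.where(future, -np.inf, scores)  # future scores become -inf
    # Softmax over the last axis; the -inf entries become exactly 0.
    exp = np.exp(masked - masked.max(axis=-1, keepdims=True))
    return exp / exp.sum(axis=-1, keepdims=True)

# Example: 4 tokens with uniform scores.
# Token 0 can only see itself; token 3 sees all four positions.
w = causal_attention_weights(np.zeros((4, 4)))
```

With uniform scores, the first row is `[1, 0, 0, 0]` and the last row is `[0.25, 0.25, 0.25, 0.25]`: each token's attention is spread only over the positions it is allowed to see.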