Language models
Transformer Architecture
A neural network architecture based on attention mechanisms that processes all positions of a sequence in parallel, with no recurrence. Transformers have revolutionized language modeling thanks to their ability to capture long-range dependencies.
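A minimal sketch of the scaled dot-product attention at the heart of the Transformer, using plain NumPy (illustrative only; function and variable names are arbitrary):

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Compute softmax(Q K^T / sqrt(d_k)) V for 2-D query/key/value matrices."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                  # similarity of every query to every key
    scores -= scores.max(axis=-1, keepdims=True)     # numerical stability before softmax
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)   # softmax over keys
    return weights @ V                               # attention-weighted sum of values

# Toy self-attention: 3 tokens, embedding size 4, Q = K = V = X
rng = np.random.default_rng(0)
X = rng.normal(size=(3, 4))
out = scaled_dot_product_attention(X, X, X)
print(out.shape)  # (3, 4)
```

Because every token attends to every other token in a single matrix multiply, the whole sequence is processed in parallel, which is what lets Transformers dispense with recurrence.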