Encoder-Decoder Attention
Mechanism where the decoder attends to encoder outputs, enabling generation of sequences conditioned on a source sequence in seq2seq models.
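The mechanism above can be sketched as scaled dot-product attention where the queries come from the decoder and the keys and values come from the encoder outputs. This is a minimal NumPy illustration; the function name `cross_attention` is ours, and the learned Q/K/V projection matrices used in real seq2seq models are omitted for brevity.

```python
import numpy as np

def cross_attention(decoder_states, encoder_outputs):
    """Encoder-decoder (cross) attention, simplified.

    Queries come from the decoder; keys and values come from the
    encoder outputs, so each decoder position gathers a weighted
    summary of the source sequence.
    """
    d_k = encoder_outputs.shape[-1]
    # Similarity of each decoder state to each encoder position,
    # scaled by sqrt(d_k) as in standard scaled dot-product attention.
    scores = decoder_states @ encoder_outputs.T / np.sqrt(d_k)
    # Softmax over source positions (numerically stabilized).
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    # Context vectors: convex combinations of encoder outputs.
    return weights @ encoder_outputs, weights

# Toy example: 2 decoder steps attending over 3 encoder positions.
enc = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
dec = np.array([[1.0, 0.0], [0.0, 1.0]])
context, attn = cross_attention(dec, enc)
```

Each row of `attn` sums to 1, and each row of `context` is the corresponding weighted mix of encoder outputs conditioning that decoding step.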