Attention Visualization
Self-Attention Matrix
A square matrix of attention weights between all pairs of tokens in the same sequence, where element (i, j) gives the weight token i places on token j, i.e., the influence of token j on the representation of token i. Each row is a probability distribution over the sequence, so the entries in a row sum to 1.
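A minimal NumPy sketch of how such a matrix arises in scaled dot-product self-attention: softmax(QK^T / sqrt(d)). The query/key matrices here are random placeholders for illustration, not learned projections.

```python
import numpy as np

def self_attention_matrix(Q, K):
    """Return softmax(Q K^T / sqrt(d)): entry (i, j) is the
    attention weight token i places on token j."""
    d = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d)
    # Row-wise softmax, stabilized by subtracting each row's max
    scores -= scores.max(axis=-1, keepdims=True)
    weights = np.exp(scores)
    return weights / weights.sum(axis=-1, keepdims=True)

# Illustrative random queries/keys for a 5-token sequence
rng = np.random.default_rng(0)
n_tokens, d = 5, 8
Q = rng.normal(size=(n_tokens, d))
K = rng.normal(size=(n_tokens, d))

A = self_attention_matrix(Q, K)
print(A.shape)        # (5, 5): square, one row per query token
print(A.sum(axis=1))  # each row sums to 1
```

Visualizing `A` as a heatmap (tokens on both axes) is the standard way to inspect which tokens attend to which.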