Efficient Transformers
Set Transformer
A permutation-invariant, attention-based architecture for processing sets of data with no predefined order. The Set Transformer uses Induced Set Attention Blocks (ISAB), in which a small set of learned inducing points attends to the input to reduce the quadratic cost of self-attention, and Pooling by Multihead Attention (PMA) to aggregate set elements into a fixed-size output.
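The induced-attention and pooling structure can be sketched with plain NumPy. This is a minimal illustration only, with projections, multiple heads, layer normalization, and feed-forward layers omitted; the function names and shapes are assumptions, not the reference implementation:

```python
import numpy as np

def attention(q, k, v):
    # Scaled dot-product attention, single head
    scores = q @ k.T / np.sqrt(q.shape[-1])
    w = np.exp(scores - scores.max(axis=-1, keepdims=True))
    w /= w.sum(axis=-1, keepdims=True)
    return w @ v

def isab(x, inducing):
    # Induced Set Attention Block (structure only):
    # m inducing points attend to the n set elements,
    # then the set attends back to that m-point summary,
    # giving O(n*m) instead of O(n^2) attention cost.
    h = attention(inducing, x, x)
    return attention(x, h, h)

def pma(x, seeds):
    # Pooling by Multihead Attention (structure only):
    # k learned seed vectors attend to the set, producing
    # a permutation-invariant, fixed-size summary.
    return attention(seeds, x, x)

rng = np.random.default_rng(0)
x = rng.normal(size=(10, 4))        # a set of 10 elements, dim 4
inducing = rng.normal(size=(3, 4))  # m = 3 inducing points
seed = rng.normal(size=(1, 4))      # k = 1 seed vector

pooled = pma(isab(x, inducing), seed)
shuffled = x[rng.permutation(10)]   # reorder the set
pooled2 = pma(isab(shuffled, inducing), seed)
assert np.allclose(pooled, pooled2)  # output ignores element order
```

The final assertion demonstrates the key property: permuting the input set leaves the pooled output unchanged, because softmax attention over set elements is itself order-agnostic.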