Transformer Optimization
Reversible Layers
Transformer layers designed so that each layer's inputs can be exactly reconstructed from its outputs, eliminating the need to store intermediate activations for backpropagation and reducing memory cost in deep networks.
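The reconstruction idea can be sketched with the RevNet-style coupling used in reversible transformers: the input is split into two halves, each half updates the other through a residual, and inverting the residuals recovers the inputs. Here `F` and `G` are hypothetical stand-ins for the attention and feed-forward sublayers; this is a minimal NumPy sketch, not a production implementation.

```python
import numpy as np

rng = np.random.default_rng(0)
W_f = rng.standard_normal((4, 4))
W_g = rng.standard_normal((4, 4))

def F(x):
    # Stand-in for the attention sublayer (any function works here).
    return np.tanh(x @ W_f)

def G(x):
    # Stand-in for the feed-forward sublayer.
    return np.tanh(x @ W_g)

def forward(x1, x2):
    # Split-channel residual scheme: each half updates the other.
    y1 = x1 + F(x2)
    y2 = x2 + G(y1)
    return y1, y2

def reverse(y1, y2):
    # Inputs are recovered exactly from the outputs alone, so the
    # forward pass does not need to cache activations for backprop.
    x2 = y2 - G(y1)
    x1 = y1 - F(x2)
    return x1, x2

x1 = rng.standard_normal((2, 4))
x2 = rng.standard_normal((2, 4))
y1, y2 = forward(x1, x2)
r1, r2 = reverse(y1, y2)
assert np.allclose(x1, r1) and np.allclose(x2, r2)
```

Because `reverse` undoes each residual in turn, the backward pass can recompute activations layer by layer instead of storing them, trading a modest amount of extra compute for memory savings.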