Sampling Acceleration
Knowledge Distillation
A technique in which a large diffusion model (the teacher) supervises the training of a smaller, faster model (the student) to reproduce its outputs, reducing the computational cost of inference. In diffusion sampling, the student is typically trained to match the result of several teacher denoising steps in a single step, so far fewer steps are needed at generation time.
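A minimal PyTorch sketch of this idea, in the spirit of progressive distillation: the student learns to match two teacher denoising steps with a single step of its own. Everything here (the TinyDenoiser network, the simplified ddim_step update, the noise parameterization) is an illustrative assumption, not any particular library's API.

```python
import torch
import torch.nn as nn

class TinyDenoiser(nn.Module):
    """Toy noise-prediction network; a stand-in for a real U-Net."""
    def __init__(self, dim=16):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(dim + 1, 64), nn.ReLU(),
                                 nn.Linear(64, dim))

    def forward(self, x, t):
        # Condition on the scalar timestep by concatenating it to the input.
        return self.net(torch.cat([x, t.expand(x.size(0), 1)], dim=-1))

def ddim_step(model, x, t_from, t_to):
    """One deterministic DDIM-style update from noise level t_from to t_to."""
    eps = model(x, t_from)
    # Simplified variance-exploding parameterization: x_t = x_0 + t * eps.
    x0_pred = x - t_from * eps
    return x0_pred + t_to * eps

teacher = TinyDenoiser()   # assumed pretrained; kept frozen below
student = TinyDenoiser()
student.load_state_dict(teacher.state_dict())  # warm-start from the teacher
opt = torch.optim.Adam(student.parameters(), lr=1e-4)

for step in range(1000):
    x0 = torch.randn(32, 16)              # stand-in for real training data
    t = torch.rand(1) * 0.8 + 0.2         # sample a noise level in (0.2, 1.0)
    t_mid, t_end = t / 2, torch.zeros(1)
    x_t = x0 + t * torch.randn_like(x0)   # noised sample at level t

    with torch.no_grad():                  # teacher: two small steps
        x_teacher = ddim_step(teacher, x_t, t, t_mid)
        x_teacher = ddim_step(teacher, x_teacher, t_mid, t_end)

    x_student = ddim_step(student, x_t, t, t_end)  # student: one big step
    loss = nn.functional.mse_loss(x_student, x_teacher)

    opt.zero_grad()
    loss.backward()
    opt.step()
```

Repeating this step-halving procedure with the trained student as the next teacher yields models that sample in a handful of steps instead of hundreds, which is where the inference savings come from.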