AI Glossary
The complete AI glossary
Knowledge Distillation
Technique where a large diffusion model (teacher) trains a smaller and faster model (student) to reproduce its outputs, thereby reducing the computational cost of inference.
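A minimal sketch of the idea, using toy linear "models" in numpy (the names `W_teacher` and `W_student` are illustrative, not from any library): the frozen teacher provides the training targets, and the student is pulled toward them with a plain MSE loss.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-ins for a large teacher and a small student denoiser:
# each is just a linear map from a noisy input to a predicted output.
W_teacher = rng.normal(size=(8, 8))   # frozen, "expensive" model
W_student = np.zeros((8, 8))          # small model to be trained

def distill_step(W_student, x_noisy, lr=0.1):
    """One gradient step pulling the student's output toward the teacher's."""
    target = x_noisy @ W_teacher      # teacher output: the training signal
    pred = x_noisy @ W_student        # student output
    grad = 2 * x_noisy.T @ (pred - target) / len(x_noisy)  # d/dW of MSE
    return W_student - lr * grad

x = rng.normal(size=(32, 8))
loss_before = np.mean((x @ W_student - x @ W_teacher) ** 2)
for _ in range(300):
    W_student = distill_step(W_student, x)
loss_after = np.mean((x @ W_student - x @ W_teacher) ** 2)
# After training, the student reproduces the teacher's outputs closely.
```

At inference time only the cheap student is evaluated, which is where the computational saving comes from.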
Progressive Distillation
Iterative distillation method where each student model learns from a slightly faster teacher model, enabling exponential acceleration of the sampling process.
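The exponential speed-up comes from step halving: each round trains a student that takes one step where its teacher takes two. A sketch of the resulting schedule (function name illustrative):

```python
# Each distillation round trains a student that takes one step where the
# current teacher takes two, halving the sampler's step count every round.
def distillation_schedule(teacher_steps, rounds):
    """Return the sampler's step count after each round."""
    counts = [teacher_steps]
    for _ in range(rounds):
        counts.append(counts[-1] // 2)  # student needs half the steps
    return counts

# Starting from a 1024-step teacher, ten rounds reach a one-step sampler.
print(distillation_schedule(1024, 10))
# [1024, 512, 256, 128, 64, 32, 16, 8, 4, 2, 1]
```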
Scheduler
Algorithm defining the sequence of noise levels and time steps for the denoising process, directly influencing the speed and quality of generation.
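As a concrete example, a sketch of the classic DDPM-style linear variance schedule (default values are common choices, not a fixed standard):

```python
import numpy as np

def linear_beta_schedule(num_steps, beta_start=1e-4, beta_end=2e-2):
    """Linear schedule of per-step noise variances (betas)."""
    return np.linspace(beta_start, beta_end, num_steps)

betas = linear_beta_schedule(1000)
alphas_cumprod = np.cumprod(1.0 - betas)  # total signal kept after each step
# alphas_cumprod decays monotonically: early steps keep most of the signal,
# late steps are almost pure noise.
```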
DDIM Scheduler (Denoising Diffusion Implicit Models)
Deterministic scheduler that allows generating samples with far fewer steps than a standard scheduler by modifying the stochasticity of the denoising process.
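A sketch of a single DDIM update with the stochasticity removed (eta = 0); `abar_t` denotes the cumulative signal fraction at step t, and variable names are illustrative:

```python
import numpy as np

def ddim_step(x_t, eps_pred, abar_t, abar_prev):
    """One deterministic DDIM update (eta = 0): predict x0, then re-noise
    it to the previous (less noisy) level without injecting fresh noise."""
    x0_pred = (x_t - np.sqrt(1 - abar_t) * eps_pred) / np.sqrt(abar_t)
    return np.sqrt(abar_prev) * x0_pred + np.sqrt(1 - abar_prev) * eps_pred

# Determinism is what allows skipping timesteps: running the same inputs
# twice gives exactly the same output, so a coarse timestep grid stays
# consistent from step to step.
rng = np.random.default_rng(0)
x_t, eps = rng.normal(size=16), rng.normal(size=16)
a = ddim_step(x_t, eps, abar_t=0.5, abar_prev=0.9)
b = ddim_step(x_t, eps, abar_t=0.5, abar_prev=0.9)
```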
DPM-Solver Scheduler
High-order solver for the diffusion ordinary differential equation (ODE), designed to accelerate diffusion model sampling; it achieves high quality with very few evaluation steps.
One-Step Sampling
Distillation objective where the student model is trained to generate a clean output from a noisy input in a single denoising step.
Latent Resampling
Acceleration strategy that modifies the denoising trajectory in latent space by resampling intermediate states to reduce the total number of required steps.
Consistency Models
Family of generative models trained to map any point on a noise trajectory directly to the starting point of the trajectory, enabling sampling in a single step or very few steps.
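The single-step property rests on a boundary condition: at the minimum noise level the model must be the identity. A sketch of the skip-connection parameterization that enforces this (Karras-style `c_skip`/`c_out` coefficients; the toy `net` and the constants are illustrative):

```python
import numpy as np

SIGMA_MIN, SIGMA_DATA = 0.002, 0.5

def c_skip(sigma):
    return SIGMA_DATA**2 / ((sigma - SIGMA_MIN)**2 + SIGMA_DATA**2)

def c_out(sigma):
    return SIGMA_DATA * (sigma - SIGMA_MIN) / np.sqrt(sigma**2 + SIGMA_DATA**2)

def consistency_fn(net, x, sigma):
    """f(x, sigma) = c_skip * x + c_out * net(x, sigma).
    At sigma = SIGMA_MIN this reduces to the identity, so the model maps
    the end of the trajectory to itself by construction."""
    return c_skip(sigma) * x + c_out(sigma) * net(x, sigma)

net = lambda x, s: np.tanh(x)   # toy stand-in for the trained network
x = np.linspace(-1, 1, 5)
# Boundary condition: at the minimum noise level the output is the input.
out_at_min = consistency_fn(net, x, SIGMA_MIN)
```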
Trajectory Distillation
A distillation variant where the student model learns to imitate the complete denoising trajectory of the teacher model across multiple steps, rather than focusing on a single step.
Linear Multistep Scheduler
Scheduler based on linear multistep methods, optimized for fast and stable convergence with a low number of sampling steps.
Token Merging Acceleration
Technique that reduces the computational complexity of the denoiser (often a U-Net or a Transformer) by merging semantically similar tokens at each denoising step.
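A greatly simplified sketch of the idea, greedily averaging the most cosine-similar token pairs (real implementations such as ToMe use a more efficient bipartite matching; this version is illustrative only):

```python
import numpy as np

def merge_most_similar(tokens, r):
    """Average the r most cosine-similar token pairs, shrinking the
    sequence length (and hence the attention cost) from n to n - r."""
    toks = list(tokens)
    for _ in range(r):
        X = np.stack(toks)
        Xn = X / np.linalg.norm(X, axis=1, keepdims=True)
        sim = Xn @ Xn.T
        np.fill_diagonal(sim, -np.inf)       # ignore self-similarity
        i, j = np.unravel_index(np.argmax(sim), sim.shape)
        merged = (toks[i] + toks[j]) / 2     # merge the closest pair
        toks = [t for k, t in enumerate(toks) if k not in (i, j)] + [merged]
    return np.stack(toks)

rng = np.random.default_rng(0)
tokens = rng.normal(size=(16, 8))            # 16 tokens, dimension 8
reduced = merge_most_similar(tokens, r=4)    # 12 tokens remain
```

Since self-attention cost grows quadratically in the token count, even a modest reduction per step compounds into a noticeable speed-up across all denoising steps.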
Uniform Denoising
Sampling strategy that uses uniformly spaced timesteps, often combined with advanced solvers to maximize efficiency with a reduced number of steps.
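A sketch of uniform timestep selection over a model trained with many steps (function name illustrative):

```python
import numpy as np

def uniform_timesteps(train_steps, sample_steps):
    """Pick `sample_steps` timesteps uniformly spaced over the full
    training range, ordered from most to least noisy."""
    ts = np.linspace(0, train_steps - 1, sample_steps).round().astype(int)
    return ts[::-1]

# e.g. 5 evenly spaced steps out of a 1000-step training schedule
steps = uniform_timesteps(1000, 5)
```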
Adversarial Distillation
Approach where a discriminator network is used to help the student model learn the distribution characteristics of the teacher model, improving the fidelity of fast generation.
Karras Scheduler
Noise scheduler that defines a smoother and more continuous noise variance, often used to improve the stability and quality of sampling with a low number of steps.
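A sketch of the Karras et al. sigma schedule: interpolating uniformly in sigma^(1/rho) rather than in sigma itself concentrates steps at low noise levels, where fine detail is resolved (the defaults shown are common choices):

```python
import numpy as np

def karras_sigmas(n, sigma_min=0.002, sigma_max=80.0, rho=7.0):
    """Karras noise schedule: interpolate uniformly in sigma**(1/rho)."""
    ramp = np.linspace(0, 1, n)
    inv_rho = 1.0 / rho
    sigmas = (sigma_max**inv_rho
              + ramp * (sigma_min**inv_rho - sigma_max**inv_rho)) ** rho
    return sigmas

sigmas = karras_sigmas(10)
# Decreases smoothly from sigma_max to sigma_min over the 10 steps.
```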
Accelerated Stochastic Sampling
Methods that retain a stochastic component in the denoising process while using advanced schedulers or solvers to reduce the number of steps, balancing diversity and speed.