
AI Glossary

The complete dictionary of Artificial Intelligence

162 categories · 2,032 subcategories · 23,060 terms

Knowledge Distillation

Technique in which a smaller, faster model (the student) is trained to reproduce the outputs of a large diffusion model (the teacher), reducing the computational cost of inference.
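
The idea can be sketched in a few lines. This is a toy stand-in for the real setup: the "teacher" here is just a fixed linear map rather than a diffusion model, and the "student" is trained by gradient descent to copy its outputs, which is the core distillation objective.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical "teacher": a fixed linear map standing in for a large model.
W_teacher = rng.normal(size=(4, 4))

def teacher(x):
    return x @ W_teacher

# "Student": another linear map, trained to reproduce the teacher's
# outputs (the distillation objective) via MSE and gradient descent.
W_student = np.zeros((4, 4))
lr = 0.05
for _ in range(500):
    x = rng.normal(size=(32, 4))                # training inputs
    target = teacher(x)                          # teacher output = soft target
    pred = x @ W_student
    grad = 2 * x.T @ (pred - target) / len(x)    # d(MSE)/dW_student
    W_student -= lr * grad

print(np.allclose(W_student, W_teacher, atol=1e-2))
```

In the real diffusion setting the same loss is applied to denoising outputs, so the student inherits the teacher's behaviour while being cheaper to evaluate.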


Progressive Distillation

Iterative distillation method in which each student learns to match two of its teacher's denoising steps with a single step of its own, then becomes the teacher for the next round, halving the number of sampling steps at every iteration.


Scheduler

Algorithm defining the sequence of noise levels and time steps for the denoising process, directly influencing the speed and quality of generation.


DDIM Scheduler (Denoising Diffusion Implicit Models)

Deterministic scheduler that removes the stochasticity of the denoising process, allowing samples to be generated in far fewer steps than a standard (DDPM-style) scheduler.
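
A minimal sketch of the deterministic (eta = 0) DDIM update, using an oracle noise predictor in place of a trained network so the arithmetic can be checked end to end; the schedule values are arbitrary assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
x0 = rng.normal(size=(8,))                 # the clean sample we hope to recover
eps = rng.normal(size=(8,))                # the noise mixed into it

# alpha_bar: cumulative signal-retention schedule, decreasing over time
alpha_bar = np.linspace(0.999, 0.01, 1000)

def model(x_t, t):
    return eps                              # oracle noise predictor (stand-in)

# Sample with only 10 of the 1000 training timesteps -- DDIM's key trick.
timesteps = np.linspace(999, 0, 10).astype(int)
x = np.sqrt(alpha_bar[999]) * x0 + np.sqrt(1 - alpha_bar[999]) * eps
for t, t_prev in zip(timesteps[:-1], timesteps[1:]):
    e = model(x, t)
    # predict x0 from the current noisy state, then jump directly to t_prev
    x0_pred = (x - np.sqrt(1 - alpha_bar[t]) * e) / np.sqrt(alpha_bar[t])
    x = np.sqrt(alpha_bar[t_prev]) * x0_pred + np.sqrt(1 - alpha_bar[t_prev]) * e

print(np.allclose(x0_pred, x0))
```

With a perfect noise predictor the clean sample is recovered exactly despite skipping 99% of the steps; a trained network only approximates this, which is why very low step counts trade off some quality.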


DPM-Solver Scheduler

High-order solver for the diffusion ordinary differential equation (ODE), designed to accelerate diffusion model sampling; it achieves high quality with very few function evaluations.


One-Step Sampling

Distillation objective where the student model is trained to generate a clean output from a noisy input in a single denoising step.


Latent Resampling

Acceleration strategy that modifies the denoising trajectory in latent space by resampling intermediate states to reduce the total number of required steps.


Consistency Models

Family of generative models trained to map any point on a noise trajectory directly to the starting point of the trajectory, enabling sampling in a single step or very few steps.
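
One common way to enforce the single-step property is the skip/output parameterization from the original consistency-models paper, which makes the map exactly the identity at the smallest noise level; the values of sigma_data and eps below are assumptions, and the network is a placeholder.

```python
import numpy as np

sigma_data = 0.5       # assumed data standard deviation
eps = 0.002            # smallest noise level on the trajectory (assumed)

def c_skip(t):
    # weight on the input x; equals 1 at t = eps
    return sigma_data**2 / ((t - eps)**2 + sigma_data**2)

def c_out(t):
    # weight on the free-form network output; equals 0 at t = eps
    return sigma_data * (t - eps) / np.sqrt(sigma_data**2 + t**2)

def f(x, t, network=lambda x, t: np.zeros_like(x)):
    # network stands in for the trained model F_theta
    return c_skip(t) * x + c_out(t) * network(x, t)

x = np.ones(4)
print(np.allclose(f(x, eps), x))   # at t = eps the map is the identity
```

The boundary condition f(x, eps) = x holds by construction, so training only needs to make outputs consistent across the rest of the trajectory.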


Trajectory Distillation

A distillation variant where the student model learns to imitate the complete denoising trajectory of the teacher model across multiple steps, rather than focusing on a single step.


Linear Multistep Scheduler

Scheduler based on linear multistep methods, optimized for fast and stable convergence with a low number of sampling steps.


Token Merging Acceleration

Technique that reduces the computational cost of the denoiser by merging redundant, semantically similar tokens inside its attention blocks (often within a U-Net or Transformer) at each denoising step.
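
A deliberately simplified sketch of the merging step: real implementations (e.g. ToMe) use an efficient bipartite matching, whereas this version greedily averages the most similar pair, which is enough to show how the token count shrinks.

```python
import numpy as np

def merge_tokens(tokens, r):
    """Greedy, simplified token merging: repeatedly average the most
    cosine-similar pair of tokens, removing r tokens in total."""
    tokens = list(tokens)
    for _ in range(r):
        T = np.stack(tokens)
        normed = T / np.linalg.norm(T, axis=1, keepdims=True)
        sim = normed @ normed.T                  # pairwise cosine similarity
        np.fill_diagonal(sim, -np.inf)           # ignore self-similarity
        i, j = np.unravel_index(np.argmax(sim), sim.shape)
        merged = (tokens[i] + tokens[j]) / 2     # average the closest pair
        tokens = [t for k, t in enumerate(tokens) if k not in (i, j)]
        tokens.append(merged)
    return np.stack(tokens)

rng = np.random.default_rng(0)
x = rng.normal(size=(16, 8))                     # 16 tokens, 8-dim features
y = merge_tokens(x, r=4)
print(y.shape)                                   # (12, 8): 4 tokens merged away
```

Since attention cost grows quadratically in the token count, even a modest reduction per layer compounds into a noticeable speedup across all denoising steps.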


Uniform Denoising

Sampling strategy that uses uniformly spaced timesteps, often combined with advanced solvers to maximize efficiency with a reduced number of steps.


Adversarial Distillation

Approach where a discriminator network is used to help the student model learn the distribution characteristics of the teacher model, improving the fidelity of fast generation.


Karras Scheduler

Noise schedule introduced by Karras et al. (EDM) that spaces noise levels along a smooth power curve, concentrating steps at low noise levels; often used to improve the stability and quality of sampling with a low number of steps.
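
The schedule can be sketched as follows; the values of sigma_min, sigma_max, and rho = 7 follow the defaults suggested in the EDM paper and should be treated as assumptions.

```python
import numpy as np

def karras_sigmas(n, sigma_min=0.002, sigma_max=80.0, rho=7.0):
    """Noise levels a la Karras et al. (EDM): interpolate sigma^(1/rho)
    linearly, then raise back to the rho power. rho > 1 concentrates
    steps near the low-noise end, where most detail is resolved."""
    ramp = np.linspace(0, 1, n)
    inv_max = sigma_max ** (1 / rho)
    inv_min = sigma_min ** (1 / rho)
    return (inv_max + ramp * (inv_min - inv_max)) ** rho

sigmas = karras_sigmas(10)
print(sigmas[0], sigmas[-1])   # spans sigma_max down to sigma_min, decreasing
```

With only 10 levels, most of them land below sigma = 1, which is the design choice that makes low-step sampling stable.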


Accelerated Stochastic Sampling

Methods that retain a stochastic component in the denoising process while using advanced schedulers or solvers to reduce the number of steps, balancing diversity and speed.
