
AI Glossary

The complete AI glossary

162 categories
2,032 subcategories
23,060 terms
📖 terms

Knowledge Distillation

Technique where a large diffusion model (the teacher) trains a smaller, faster model (the student) to reproduce its outputs, reducing the computational cost of inference.
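As an illustrative sketch of the idea (not any specific library's API), the toy "teacher" below is a frozen random feature map and the "student" is a single linear layer fit to reproduce the teacher's outputs; the closed-form least-squares fit stands in for gradient descent on the MSE distillation loss:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-ins: the "teacher" is a frozen wide network,
# the "student" a much smaller (here: linear) model.
W1, W2 = rng.standard_normal((16, 64)), rng.standard_normal((64, 16))
teacher = lambda x: np.maximum(x @ W1, 0.0) @ W2   # frozen teacher

noisy = rng.standard_normal((256, 16))             # batch of noisy inputs
targets = teacher(noisy)                           # teacher's predictions

# Distillation: fit the student to the teacher's outputs (least squares
# here stands in for minimizing the MSE distillation loss by SGD).
Ws, *_ = np.linalg.lstsq(noisy, targets, rcond=None)
student = lambda x: x @ Ws

mse = np.mean((student(noisy) - targets) ** 2)     # distillation loss
```

At inference time only the small student is evaluated, which is the source of the speedup.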


Progressive Distillation

Iterative distillation method where each student model learns from a slightly faster teacher model, enabling exponential acceleration of the sampling process.


Scheduler

Algorithm defining the sequence of noise levels and time steps for the denoising process, directly influencing the speed and quality of generation.


DDIM Scheduler (Denoising Diffusion Implicit Models)

Deterministic scheduler that allows generating samples with far fewer steps than a standard scheduler by modifying the stochasticity of the denoising process.
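A minimal sketch of the deterministic (eta = 0) DDIM update, using the standard alpha-cumulative-product notation; the alpha values and noise prediction here are placeholders, not output of a real model:

```python
import numpy as np

def ddim_step(x_t, eps_pred, alpha_t, alpha_prev):
    """One deterministic DDIM update (eta = 0): no noise is re-injected."""
    # Recover the clean sample x0 implied by the noise estimate.
    x0 = (x_t - np.sqrt(1.0 - alpha_t) * eps_pred) / np.sqrt(alpha_t)
    # Jump directly to the alpha_prev noise level along the
    # deterministic trajectory, which is what allows large step sizes.
    return np.sqrt(alpha_prev) * x0 + np.sqrt(1.0 - alpha_prev) * eps_pred

x_t = np.random.randn(4)                       # toy noisy latent
eps = np.random.randn(4)                       # placeholder noise prediction
x_prev = ddim_step(x_t, eps, alpha_t=0.5, alpha_prev=0.9)
```

Because the update is deterministic, the same noise seed always yields the same sample, and steps can be spaced far apart.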


DPM-Solver Scheduler

High-order solver for the diffusion ordinary differential equation (ODE), designed to accelerate diffusion model sampling and achieve high quality in very few evaluation steps.


One-Step Sampling

Distillation objective where the student model is trained to generate a clean output from a noisy input in a single denoising step.


Latent Resampling

Acceleration strategy that modifies the denoising trajectory in latent space by resampling intermediate states to reduce the total number of required steps.


Consistency Models

Family of generative models trained to map any point on a noise trajectory directly to the starting point of the trajectory, enabling sampling in a single step or very few steps.
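The "map any point to the trajectory start" property is typically enforced by a skip parameterization with boundary condition f(x, epsilon) = x. A sketch under the usual EDM-style coefficients (the network here is a placeholder, and SIGMA_DATA / EPSILON values are illustrative assumptions):

```python
import numpy as np

SIGMA_DATA = 0.5   # assumed data std for the parameterization
EPSILON = 0.002    # smallest time on the noise trajectory

def c_skip(t):
    return SIGMA_DATA**2 / ((t - EPSILON)**2 + SIGMA_DATA**2)

def c_out(t):
    return SIGMA_DATA * (t - EPSILON) / np.sqrt(SIGMA_DATA**2 + t**2)

def consistency_fn(x, t, network):
    # Skip parameterization: f(x, EPSILON) = x holds by construction,
    # no matter what the free network predicts (boundary condition).
    return c_skip(t) * x + c_out(t) * network(x, t)

x = np.random.randn(4)
dummy_net = lambda x, t: np.ones_like(x)   # placeholder for a trained net
out_at_boundary = consistency_fn(x, EPSILON, dummy_net)
```

Training then pushes f(x_t, t) at different t on the same trajectory to agree, so a single evaluation at large t already lands near the clean sample.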


Trajectory Distillation

A distillation variant where the student model learns to imitate the complete denoising trajectory of the teacher model across multiple steps, rather than focusing on a single step.


Linear Multistep Scheduler

Scheduler based on linear multistep methods, optimized for fast and stable convergence with a low number of sampling steps.


Token Merging Acceleration

Technique that reduces the computational cost of the denoiser (often a U-Net or Transformer) by merging semantically similar tokens at each denoising step.
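A toy sketch of the core operation (greatly simplified relative to real token-merging schemes, which use bipartite matching): find the most cosine-similar pair of tokens and replace it with its average, shrinking the sequence by one.

```python
import numpy as np

def merge_most_similar(tokens):
    """Merge the single most similar token pair (a toy merging step)."""
    normed = tokens / np.linalg.norm(tokens, axis=1, keepdims=True)
    sim = normed @ normed.T                 # pairwise cosine similarity
    np.fill_diagonal(sim, -np.inf)          # ignore self-similarity
    i, j = np.unravel_index(np.argmax(sim), sim.shape)
    merged = (tokens[i] + tokens[j]) / 2.0  # average the closest pair
    keep = [k for k in range(len(tokens)) if k not in (i, j)]
    return np.vstack([tokens[keep], merged[None, :]])

tokens = np.random.randn(8, 4)              # 8 tokens of dimension 4
reduced = merge_most_similar(tokens)        # 8 tokens -> 7 tokens
```

Fewer tokens means quadratic attention cost drops at every denoising step, which is where the acceleration comes from.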


Uniform Denoising

Sampling strategy that uses uniformly spaced timesteps, often combined with advanced solvers to maximize efficiency with a reduced number of steps.


Adversarial Distillation

Approach where a discriminator network is used to help the student model learn the distribution characteristics of the teacher model, improving the fidelity of fast generation.


Karras Scheduler

Noise scheduler that defines a smoother and more continuous noise variance, often used to improve the stability and quality of sampling with a low number of steps.
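The schedule interpolates in sigma^(1/rho) space rather than linearly in sigma, which concentrates steps near the low-noise end. A sketch with the commonly used defaults (rho = 7, sigma range as assumed values):

```python
import numpy as np

def karras_sigmas(n, sigma_min=0.002, sigma_max=80.0, rho=7.0):
    """Karras-style noise schedule: interpolate in sigma**(1/rho)."""
    ramp = np.linspace(0.0, 1.0, n)
    inv_rho = 1.0 / rho
    # Endpoints map exactly to sigma_max and sigma_min; intermediate
    # values are spaced smoothly, denser at low noise levels.
    return (sigma_max**inv_rho
            + ramp * (sigma_min**inv_rho - sigma_max**inv_rho)) ** rho

sigmas = karras_sigmas(10)   # 10 decreasing noise levels
```

The smooth, monotonically decreasing curve is what lets samplers take few steps without visible quality loss.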


Accelerated Stochastic Sampling

Methods that retain a stochastic component in the denoising process while using advanced schedulers or solvers to reduce the number of steps, balancing diversity and speed.
