AI Glossary

The complete AI glossary

162 categories · 2,032 subcategories · 23,060 terms

Variational Autoencoder

Generative neural network architecture that learns a probabilistic latent representation of input data by maximizing a lower bound on the log-likelihood.
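The encode–sample–decode round trip can be sketched in a few lines of NumPy. The random weight matrices below are illustrative placeholders for learned parameters; a real VAE would use a deep-learning framework and train them:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "encoder": maps input x to the mean and log-variance of q(z|x).
# These weights are random stand-ins for learned parameters.
W_mu = rng.normal(size=(4, 8))
W_logvar = rng.normal(size=(4, 8))

def encode(x):
    return x @ W_mu.T, x @ W_logvar.T  # mu, log sigma^2

# Toy "decoder": maps a latent code z back to input space.
W_dec = rng.normal(size=(8, 4))

def decode(z):
    return z @ W_dec.T

x = rng.normal(size=(1, 8))
mu, logvar = encode(x)
# Reparameterized sample from q(z|x):
z = mu + np.exp(0.5 * logvar) * rng.standard_normal(mu.shape)
x_hat = decode(z)  # reconstruction has the same shape as the input
```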

ELBO (Evidence Lower Bound)

Objective function maximized in variational learning, combining reconstruction loss and KL regularization as a lower bound on the marginal log-likelihood.
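Written out for a datapoint x, with encoder parameters φ and decoder parameters θ, the bound is:

```latex
\log p_\theta(x) \;\ge\; \mathcal{L}(\theta, \phi; x)
  = \mathbb{E}_{q_\phi(z \mid x)}\!\left[\log p_\theta(x \mid z)\right]
  \;-\; \mathrm{KL}\!\left(q_\phi(z \mid x) \,\|\, p(z)\right)
```

The first term is the (negative) reconstruction loss; the second is the KL regularization toward the prior.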

Inference Posterior

Approximate distribution q(z|x) learned by the encoder to estimate the true posterior p(z|x), typically modeled as a diagonal Gaussian.

Convolutional VAE

VAE variant using convolutional layers to efficiently process image data, preserving spatial structure and improving reconstruction quality.

Hierarchical VAE

Multi-level architecture where latent variables are organized hierarchically, enabling more expressive modeling and progressive generation.

Continuous Latent Space

Representation space where each point corresponds to a valid sample and transitions between points are smooth, enabling interpolation and semantic manipulation.
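Smoothness is what makes latent interpolation work: moving along a straight line between the codes of two inputs yields a gradual transition between them. A minimal sketch (the latent codes here are placeholders for encoder outputs):

```python
import numpy as np

def interpolate(z_a, z_b, steps=5):
    """Linearly interpolate between two latent codes."""
    ts = np.linspace(0.0, 1.0, steps)
    return np.stack([(1 - t) * z_a + t * z_b for t in ts])

z_a, z_b = np.zeros(4), np.ones(4)
path = interpolate(z_a, z_b)
# Decoding each point on `path` would produce a smooth morph
# from the first sample to the second.
```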

Disentanglement Factor

Property of the latent space where each dimension captures an independent factor of variation in the data, facilitating interpretability and generative control.

Reparameterized Sampling

Sampling process enabling differentiability by separating sources of randomness from learnable parameters, essential for VAE training.
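Concretely, instead of sampling z ~ N(μ, σ²) directly, the trick draws parameter-free noise ε ~ N(0, I) and computes z = μ + σ·ε, so gradients can flow through μ and σ:

```python
import numpy as np

rng = np.random.default_rng(42)

def reparameterize(mu, logvar):
    """z = mu + sigma * eps: the randomness lives in eps, not in the parameters."""
    eps = rng.standard_normal(mu.shape)  # parameter-free noise
    return mu + np.exp(0.5 * logvar) * eps

mu = np.zeros(3)
logvar = np.zeros(3)  # sigma = 1
z = reparameterize(mu, logvar)
```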

Conditional Generation

Extension of VAEs where generation is conditioned on additional information such as labels or attributes, allowing targeted control over outputs.
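In a conditional VAE the simplest conditioning scheme is to concatenate the conditioning signal (e.g. a one-hot class label) to the latent code before decoding. A sketch of that input construction:

```python
import numpy as np

def one_hot(label, num_classes):
    v = np.zeros(num_classes)
    v[label] = 1.0
    return v

# The decoder receives the latent code concatenated with the condition,
# so sampling z with a fixed label steers generation toward that class.
z = np.random.default_rng(1).standard_normal(4)
cond_input = np.concatenate([z, one_hot(2, num_classes=10)])
```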

Variational Learning

Optimization paradigm that approximates Bayesian inference through ELBO maximization; the theoretical foundation for training VAEs.

Distribution Regularization

Mechanism, typically a KL-divergence penalty, that pushes the learned latent distribution toward a chosen prior (usually a standard Gaussian), ensuring a smooth, structured latent space; if weighted too heavily, it can instead cause posterior collapse.
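For the common case of a diagonal Gaussian posterior q = N(μ, diag(σ²)) and a standard normal prior, the KL penalty has a closed form, KL = −½ Σ (1 + log σ² − μ² − σ²):

```python
import numpy as np

def kl_to_standard_normal(mu, logvar):
    """KL( N(mu, diag(exp(logvar))) || N(0, I) ), summed over dimensions."""
    return -0.5 * np.sum(1.0 + logvar - mu**2 - np.exp(logvar))

# The penalty vanishes when q already matches the prior,
# and grows as the posterior drifts away from it.
kl = kl_to_standard_normal(np.zeros(4), np.zeros(4))
```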
