
AI Glossary

The complete dictionary of Artificial Intelligence

162 categories · 2,032 subcategories · 23,060 terms

Generative Adversarial Network

Unsupervised learning architecture composed of two neural networks, a generator and a discriminator, that compete against each other to produce realistic synthetic data.


Minimax Loss

Original objective function of GANs, in which the generator minimizes log(1 − D(G(z))), the log-probability of the discriminator correctly rejecting generated samples, while the discriminator maximizes the probability of classifying real and generated samples correctly.
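This objective is the original minimax game introduced by Goodfellow et al. (2014):

```latex
\min_G \max_D V(D, G) =
\mathbb{E}_{x \sim p_{\text{data}}(x)}\big[\log D(x)\big]
+ \mathbb{E}_{z \sim p_z(z)}\big[\log\big(1 - D(G(z))\big)\big]
```

The discriminator D maximizes V while the generator G minimizes it; in practice the generator is often trained with the non-saturating variant, maximizing log D(G(z)) instead.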


Latent Space

Lower-dimensional vector space from which the generator samples random noise to create data, allowing semantic control over the generated features.


StyleGAN

Advanced GAN architecture using a mapping network and AdaIN modules to control hierarchical styles of generated features at different resolutions.


Jensen-Shannon Divergence

Symmetric and bounded divergence used in the analysis of the original GAN objective to measure the discrepancy between the real and generated data distributions.
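A minimal NumPy sketch of the divergence over discrete distributions (the function names here are illustrative, not from any particular library):

```python
import numpy as np

def kl(p, q):
    # Kullback-Leibler divergence in nats, skipping zero-probability terms
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

def js_divergence(p, q):
    # JSD(P || Q) = 0.5 * KL(P || M) + 0.5 * KL(Q || M), with M = (P + Q) / 2.
    # Symmetric, and bounded above by log(2) in nats.
    p, q = np.asarray(p, float), np.asarray(q, float)
    m = 0.5 * (p + q)
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)
```

For identical distributions the divergence is 0; for distributions with disjoint support it reaches its maximum of log 2, which is why the original GAN loss provides no useful gradient when real and generated supports do not overlap.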


Gradient Penalty

Regularization term added to the WGAN loss function that pushes the norm of the discriminator's gradients toward 1, enforcing the 1-Lipschitz constraint required by the Wasserstein formulation.
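The resulting WGAN-GP objective (Gulrajani et al., 2017) reads:

```latex
L = \mathbb{E}_{\tilde{x} \sim p_g}\big[D(\tilde{x})\big]
  - \mathbb{E}_{x \sim p_r}\big[D(x)\big]
  + \lambda\, \mathbb{E}_{\hat{x} \sim p_{\hat{x}}}
    \Big[\big(\lVert \nabla_{\hat{x}} D(\hat{x}) \rVert_2 - 1\big)^2\Big]
```

where x̂ is sampled uniformly along straight lines between pairs of real and generated points, and λ = 10 is the value used in the original paper.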


Nash Equilibrium

Optimal point where neither the generator nor the discriminator can improve its performance by unilaterally modifying its parameters, indicating training convergence.


Encoder Network

Additional component in BiGAN or ALI variants that learns to map real data to latent space, enabling latent inference and reconstruction.


Cycle Consistency Loss

Additional loss function in CycleGANs ensuring content preservation during translations between unpaired domains via back-and-forth cycles.
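With generators G : X → Y and F : Y → X, the cycle consistency term from the CycleGAN paper is:

```latex
\mathcal{L}_{\text{cyc}}(G, F) =
\mathbb{E}_{x \sim p(x)}\big[\lVert F(G(x)) - x \rVert_1\big]
+ \mathbb{E}_{y \sim p(y)}\big[\lVert G(F(y)) - y \rVert_1\big]
```

The L1 penalty on the round trips x → G(x) → F(G(x)) and y → F(y) → G(F(y)) is what allows training on unpaired domains.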


Spectral Normalization

Regularization technique that divides each discriminator weight matrix by its spectral norm (largest singular value), stabilizing GAN training by bounding the discriminator's Lipschitz constant.
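A minimal NumPy sketch of the idea, estimating the spectral norm by power iteration (the real technique, from Miyato et al., runs one power-iteration step per training update; `spectral_norm` here is an illustrative name):

```python
import numpy as np

def spectral_norm(w, n_iter=50):
    # Estimate the largest singular value of w by power iteration.
    rng = np.random.default_rng(0)
    u = rng.normal(size=w.shape[0])
    for _ in range(n_iter):
        v = w.T @ u
        v /= np.linalg.norm(v)
        u = w @ v
        u /= np.linalg.norm(u)
    # At convergence, u and v are the leading singular vectors, so
    # u^T w v equals the spectral norm.
    return float(u @ w @ v)

w = np.array([[3.0, 0.0], [0.0, 1.0]])
w_sn = w / spectral_norm(w)  # normalized weights have spectral norm ~ 1
```

Dividing every weight matrix by its spectral norm makes each linear layer 1-Lipschitz, which bounds the Lipschitz constant of the whole discriminator.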


Progressive Growing

Training strategy where networks start with low-resolution images and progressively add layers to increase resolution, stabilizing convergence.


VAE-GAN

Hybrid architecture combining a Variational Auto-Encoder with a GAN, where the VAE ensures latent-space coverage and the adversarial loss improves the visual quality of generated samples.


Fréchet Inception Distance

Quantitative evaluation metric measuring the similarity between Inception feature distributions of real and generated images via the Fréchet distance.
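In practice the means and covariances below come from Inception-v3 feature activations of real and generated images; this sketch only shows the closed-form Fréchet distance between the two fitted Gaussians (the function name is illustrative):

```python
import numpy as np
from scipy.linalg import sqrtm

def frechet_distance(mu1, sigma1, mu2, sigma2):
    # Fréchet distance between N(mu1, sigma1) and N(mu2, sigma2):
    # ||mu1 - mu2||^2 + Tr(sigma1 + sigma2 - 2 * sqrt(sigma1 @ sigma2))
    diff = mu1 - mu2
    covmean = sqrtm(sigma1 @ sigma2)
    if np.iscomplexobj(covmean):
        # sqrtm can return tiny imaginary components from numerical error
        covmean = covmean.real
    return float(diff @ diff + np.trace(sigma1 + sigma2 - 2.0 * covmean))
```

Identical feature distributions give a score of 0; lower FID means the generated images are statistically closer to the real ones.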
