AI Glossary

The complete dictionary of Artificial Intelligence

162 categories · 2,032 subcategories · 23,060 terms

Neural Architecture Search (NAS)

Process of automating the design of optimal neural network architectures for a given task, exploring a vast search space of topologies and hyperparameters.
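In its simplest form the automated loop is just: sample an architecture from the search space, evaluate it, keep the best. A random-search baseline sketch (the search space, names, and scoring function are all illustrative, standing in for a real train-and-validate step):

```python
import random

# Toy search space: one operation per layer, variable depth (illustrative).
SEARCH_SPACE = {"ops": ["conv3x3", "conv5x5", "maxpool"], "depth": [4, 6, 8]}

def sample_architecture(rng):
    depth = rng.choice(SEARCH_SPACE["depth"])
    return [rng.choice(SEARCH_SPACE["ops"]) for _ in range(depth)]

def nas_random_search(score, budget=50, seed=0):
    """Simplest NAS baseline: sample architectures at random and keep
    the best-scoring one. `score` stands in for the expensive
    train-and-validate step that smarter NAS methods try to spend wisely."""
    rng = random.Random(seed)
    best, best_score = None, float("-inf")
    for _ in range(budget):
        arch = sample_architecture(rng)
        s = score(arch)
        if s > best_score:
            best, best_score = arch, s
    return best, best_score

# Hypothetical scorer: pretend shallow, conv3x3-heavy nets do best here.
toy_score = lambda arch: arch.count("conv3x3") - 0.1 * len(arch)
best_arch, best_val = nas_random_search(toy_score)
```

Every other NAS method in this glossary can be read as a smarter replacement for the random sampling or the expensive scoring in this loop.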

Architecture Search Space

Set of all possible neural network architectures, defined by constraints such as number of layers, operation types, and connection patterns.

Surrogate Model

Statistical model that approximates an expensive-to-evaluate performance function (such as fully training a neural network) in order to accelerate the optimization process.

Expected Improvement (EI)

Acquisition function criterion that selects the next point to evaluate by maximizing the expectation of improvement over the current best performance.
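Under a Gaussian surrogate posterior with mean mu and standard deviation sigma at a candidate point, EI has a well-known closed form. A minimal sketch (the `xi` exploration margin and its default are illustrative, not from any particular library):

```python
from math import erf, exp, pi, sqrt

def expected_improvement(mu, sigma, best, xi=0.01):
    """Closed-form EI for a Gaussian posterior N(mu, sigma^2), when
    maximizing the objective; `best` is the incumbent value and
    `xi` is a small exploration margin (illustrative default)."""
    if sigma == 0.0:
        return 0.0
    z = (mu - best - xi) / sigma
    cdf = 0.5 * (1.0 + erf(z / sqrt(2.0)))   # standard normal CDF
    pdf = exp(-0.5 * z * z) / sqrt(2.0 * pi)  # standard normal PDF
    return (mu - best - xi) * cdf + sigma * pdf
```

A candidate whose predicted mean clearly beats the incumbent, or whose uncertainty is high, scores a large EI; points the surrogate is confident are worse score near zero.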

Reinforcement Learning-based NAS

NAS approach where a controller, often a recurrent network, learns to generate neural network architectures by maximizing a performance reward.

Evolutionary NAS

NAS method inspired by biological evolution, using mutation and crossover operators on a population of architectures to find better ones.
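A toy sketch of the idea, with a made-up list-of-operations encoding and a hypothetical fitness function standing in for real validation accuracy:

```python
import random

# Toy encoding: an architecture is a list of operation choices per layer.
OPS = ["conv3x3", "conv5x5", "maxpool", "skip"]

def mutate(arch):
    """Mutation operator: flip one randomly chosen layer to a new op."""
    child = list(arch)
    i = random.randrange(len(child))
    child[i] = random.choice(OPS)
    return child

def crossover(a, b):
    """One-point crossover between two parent architectures."""
    cut = random.randrange(1, len(a))
    return a[:cut] + b[cut:]

def evolve(fitness, layers=6, pop_size=20, generations=50, seed=0):
    """Minimal evolutionary search: tournament selection, mutation,
    and replacement of the oldest member of the population."""
    random.seed(seed)
    pop = [[random.choice(OPS) for _ in range(layers)] for _ in range(pop_size)]
    for _ in range(generations):
        a, b = random.sample(pop, 2)
        parent = a if fitness(a) >= fitness(b) else b
        pop.pop(0)               # age out the oldest architecture
        pop.append(mutate(parent))
    return max(pop, key=fitness)

# Hypothetical fitness: pretend skip connections help on this task.
toy_fitness = lambda arch: arch.count("skip")
best = evolve(toy_fitness)
```

Real systems replace `toy_fitness` with (often low-fidelity) training and evaluation of each candidate.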

Low-Fidelity Evaluation

Strategy for estimating architecture performance at reduced cost, for example by training for fewer epochs, on a subset of the dataset, or with a down-scaled model.

Gradient-Based NAS

NAS technique that relaxes the discrete architecture selection problem into a continuous problem, allowing gradient descent to optimize architecture weights.
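The core trick (as in DARTS-style methods) is to replace the hard choice of one operation per edge with a softmax-weighted mixture, so the architecture parameters become ordinary differentiable weights. A toy 1-D sketch with made-up operations:

```python
from math import exp

# Candidate operations on an edge (toy 1-D "feature" transforms).
ops = {
    "identity": lambda x: x,
    "double":   lambda x: 2.0 * x,
    "zero":     lambda x: 0.0,
}

def softmax(values):
    exps = [exp(v) for v in values]
    total = sum(exps)
    return [e / total for e in exps]

def mixed_op(x, alphas):
    """Continuous relaxation: instead of picking one op, output the
    softmax(alpha)-weighted sum of all candidate ops. The alphas can
    then be updated by gradient descent like any other parameter."""
    weights = softmax(alphas)
    return sum(w * op(x) for w, op in zip(weights, ops.values()))

def discretize(alphas):
    """After the search, keep the op with the largest architecture weight."""
    names = list(ops)
    return names[max(range(len(alphas)), key=lambda i: alphas[i])]

# With a strongly positive alpha on "double", the mixture approaches 2x.
alphas = [0.0, 5.0, -5.0]
y = mixed_op(3.0, alphas)
```

As the alpha for one operation grows, the mixture converges to that operation alone, which is what `discretize` exploits at the end of the search.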

Hypernetwork

A neural network that generates the weights of another network, making it possible to parameterize and optimize an entire family of architectures through a single model.

Architecture Cell

A repeatable building block in a neural network architecture, whose internal structure is optimized by NAS and then stacked to form the final model.

Multi-Objective NAS

A variant of NAS aimed at simultaneously optimizing multiple metrics, such as accuracy, latency, or energy consumption, to find optimal trade-offs.
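The trade-offs are usually summarized by the Pareto front: the set of candidates that no other candidate beats on every objective at once. A small sketch with hypothetical (error, latency) pairs, both to be minimized:

```python
def dominates(a, b):
    """a Pareto-dominates b if it is no worse on every objective and
    strictly better on at least one (all objectives to be minimized)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(points):
    """Keep only the non-dominated candidates."""
    return [p for p in points if not any(dominates(q, p) for q in points if q != p)]

# Hypothetical candidates as (error, latency_ms): lower is better on both.
candidates = [(0.10, 30.0), (0.08, 55.0), (0.12, 25.0), (0.10, 60.0)]
front = pareto_front(candidates)
```

Here (0.10, 60.0) is dominated by (0.10, 30.0), which matches its error but halves its latency, so it drops out of the front.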

Tree-structured Parzen Estimator (TPE)

A Bayesian optimization algorithm that models the densities of good and bad configurations with tree-structured Parzen estimators and uses their ratio to guide the search.
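A compact sketch of the core rule: split past trials at a quantile gamma into good and bad sets, fit a density to each, and propose the candidate maximizing the ratio l(x)/g(x). The flat 1-D KDE, bandwidth, and toy loss here are illustrative simplifications, not the full tree-structured estimator:

```python
from math import exp, pi, sqrt

def kde(points, bandwidth=0.5):
    """Gaussian kernel density estimate over 1-D observations
    (illustrative stand-in for a Parzen estimator)."""
    def density(x):
        return sum(
            exp(-0.5 * ((x - p) / bandwidth) ** 2) / (bandwidth * sqrt(2 * pi))
            for p in points
        ) / len(points)
    return density

def tpe_suggest(trials, candidates, gamma=0.25):
    """trials: list of (x, loss). Split at the gamma-quantile into
    'good' and 'bad' sets, model each with a density, and return the
    candidate maximizing the ratio l(x) / g(x)."""
    ordered = sorted(trials, key=lambda t: t[1])
    n_good = max(1, int(gamma * len(ordered)))
    good = kde([x for x, _ in ordered[:n_good]])
    bad = kde([x for x, _ in ordered[n_good:]])
    return max(candidates, key=lambda x: good(x) / bad(x))

# Toy quadratic loss minimized at x = 2: TPE should favor points near 2.
trials = [(x, (x - 2.0) ** 2) for x in [0.0, 0.5, 1.0, 1.9, 2.1, 3.0, 4.0]]
suggestion = tpe_suggest(trials, candidates=[0.0, 1.0, 2.0, 3.0, 4.0])
```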

Bandit-Based NAS

A NAS approach treating the selection of architecture components as a multi-armed bandit problem, balancing exploration and exploitation to build the model.
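A minimal UCB1 sketch of the exploration/exploitation balance, with hypothetical arms standing in for candidate architecture components and a noisy reward standing in for their measured quality:

```python
from math import log, sqrt
import random

def ucb_pick(counts, rewards, t, c=1.4):
    """UCB1: pick the arm maximizing mean reward + exploration bonus."""
    best, best_score = 0, float("-inf")
    for arm in range(len(counts)):
        if counts[arm] == 0:
            return arm  # try every arm at least once
        score = rewards[arm] / counts[arm] + c * sqrt(log(t) / counts[arm])
        if score > best_score:
            best, best_score = arm, score
    return best

def run_bandit(arm_means, steps=2000, seed=0):
    """Treat each candidate component as a bandit arm with noisy reward
    and count how often each arm ends up being pulled."""
    random.seed(seed)
    k = len(arm_means)
    counts, rewards = [0] * k, [0.0] * k
    for t in range(1, steps + 1):
        arm = ucb_pick(counts, rewards, t)
        counts[arm] += 1
        rewards[arm] += arm_means[arm] + random.gauss(0.0, 0.1)
    return counts

# Hypothetical arms: the third component gives the best average reward.
counts = run_bandit([0.2, 0.5, 0.9])
```

The exploration bonus shrinks as an arm is pulled more often, so weak arms still get occasional tries while most of the budget concentrates on the strongest one.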

Performance Proxy

A low-cost metric or model used to estimate the final performance of an architecture, avoiding a full and lengthy training phase during the search stage.

Reduced Search Space

A strategy involving limiting the architecture search space to predefined blocks or patterns to accelerate the convergence of the NAS algorithm.

Weight Sharing

A technique where the weights of a neural network are shared between multiple candidate architectures being evaluated, drastically reducing the computational cost of NAS search.
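A toy sketch of the mechanism, in the spirit of ENAS-style supernets: every candidate reads its parameters out of one shared table, so evaluating a new candidate is a lookup rather than a training run. The table, random "weights", and scoring are all illustrative:

```python
import random

random.seed(0)

# One shared table of weights, keyed by (layer, operation) pairs.
# Every candidate architecture reuses these entries instead of
# training its own copy from scratch.
LAYERS = 4
OPS = ["conv3x3", "conv5x5", "skip"]
shared_weights = {(l, op): random.gauss(0, 1) for l in range(LAYERS) for op in OPS}

def sample_subnet():
    """A candidate architecture: one operation chosen per layer."""
    return [random.choice(OPS) for _ in range(LAYERS)]

def evaluate(subnet):
    """Toy evaluation that only reads the shared table; no per-candidate
    training happens, which is the point of weight sharing."""
    return sum(shared_weights[(l, op)] for l, op in enumerate(subnet))

a, b = sample_subnet(), sample_subnet()
# Both candidates consume the same underlying parameters.
```

In real systems the shared table is a trained supernet, and the cheap `evaluate` step is a forward pass on validation data with the sampled sub-network's slice of those weights.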
