AI Glossary
The complete AI glossary
Grid Search
Exhaustive optimization method that systematically evaluates every combination of hyperparameters on a predefined grid. It is guaranteed to find the best configuration on the grid, but its cost grows exponentially with the number of hyperparameters, making it impractical for high-dimensional spaces.
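A minimal sketch of the idea, assuming a toy score function standing in for a full train-and-validate cycle; all names and values are illustrative:

```python
from itertools import product

# Hypothetical objective: in practice this trains and validates a model.
def score(lr, batch_size):
    return -((lr - 0.01) ** 2) - 0.0001 * abs(batch_size - 64)

grid = {
    "lr": [0.001, 0.01, 0.1],
    "batch_size": [32, 64, 128],
}

# Evaluate every combination on the grid exhaustively.
best_cfg, best_val = None, float("-inf")
for lr, bs in product(grid["lr"], grid["batch_size"]):
    val = score(lr, bs)
    if val > best_val:
        best_cfg, best_val = {"lr": lr, "batch_size": bs}, val

print(best_cfg, best_val)  # best configuration *on the grid*
```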
Random Search
Optimization technique that samples hyperparameter combinations at random from specified distributions. For a fixed budget it is often more efficient than grid search in high-dimensional spaces, because it covers each individual dimension more densely, which matters when only a few hyperparameters dominate performance.
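A minimal sketch, again assuming a toy score function; note that the trial budget stays fixed regardless of how many hyperparameters are searched:

```python
import math
import random

random.seed(0)

# Hypothetical objective standing in for a train/validate cycle.
def score(lr, dropout):
    return -((math.log10(lr) + 2) ** 2) - (dropout - 0.3) ** 2

best_cfg, best_val = None, float("-inf")
for _ in range(30):  # same budget regardless of dimensionality
    cfg = {
        "lr": 10 ** random.uniform(-5, -1),   # log-uniform distribution
        "dropout": random.uniform(0.0, 0.5),  # uniform distribution
    }
    val = score(**cfg)
    if val > best_val:
        best_cfg, best_val = cfg, val

print(best_cfg, best_val)
```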
BOHB
Hybrid of Bayesian Optimization and Hyperband that uses a TPE model to guide configuration selection within Hyperband's adaptive resource-allocation scheme. The combination pairs the sample efficiency of Bayesian optimization with Hyperband's early termination of weak configurations.
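A full BOHB implementation is involved; a close practical approximation pairs a TPE sampler with a Hyperband pruner, as in this minimal sketch using the Optuna library (the toy objective and budgets are assumptions):

```python
import optuna

def objective(trial):
    lr = trial.suggest_float("lr", 1e-5, 1e-1, log=True)
    # Toy "training curve": start worse for lr values far from 1e-2.
    loss = (lr - 1e-2) ** 2 + 1.0
    for step in range(20):
        loss *= 0.9                    # stand-in for one epoch of training
        trial.report(loss, step)       # expose intermediate results
        if trial.should_prune():       # Hyperband stops weak trials early
            raise optuna.TrialPruned()
    return loss

study = optuna.create_study(
    direction="minimize",
    sampler=optuna.samplers.TPESampler(seed=0),  # Bayesian (TPE) proposals
    pruner=optuna.pruners.HyperbandPruner(),     # adaptive resource allocation
)
study.optimize(objective, n_trials=25)
print(study.best_params)
```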
Tree-structured Parzen Estimator
Variant of Bayesian optimization that separately models the hyperparameter densities of good and bad configurations, commonly written l(x) and g(x). The algorithm preferentially samples candidates where the ratio l(x)/g(x) is high, i.e., in regions where high-performing configurations are more likely.
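A minimal one-dimensional sketch, assuming Gaussian Parzen windows with a fixed bandwidth and a top-quartile split into good and bad observations:

```python
import numpy as np

rng = np.random.default_rng(0)

def objective(x):                      # toy loss to minimize
    return (x - 2.0) ** 2

def parzen_pdf(x, centers, bw=0.5):    # Gaussian kernel density estimate
    z = (x - centers[:, None]) / bw
    return np.exp(-0.5 * z ** 2).mean(axis=0) / (bw * np.sqrt(2 * np.pi))

xs = list(rng.uniform(-5, 5, 10))      # random initial observations
ys = [objective(x) for x in xs]

for _ in range(40):
    order = np.argsort(ys)
    n_good = max(1, len(xs) // 4)      # top quartile counts as "good"
    good = np.array(xs)[order[:n_good]]
    bad = np.array(xs)[order[n_good:]]
    cands = rng.uniform(-5, 5, 100)
    # Sample where the good density l(x) is high relative to g(x).
    ratio = parzen_pdf(cands, good) / (parzen_pdf(cands, bad) + 1e-12)
    x_new = cands[np.argmax(ratio)]
    xs.append(x_new)
    ys.append(objective(x_new))

print(min(ys), xs[int(np.argmin(ys))])
```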
Genetic Algorithm
Optimization method inspired by natural evolution that evolves a population of configurations through selection, crossover, and mutation. It is particularly well-suited for discrete search spaces and problems with multiple local optima.
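A minimal sketch on a toy discrete space, assuming truncation selection, uniform crossover, and per-gene resampling as the mutation operator:

```python
import random

random.seed(0)

def fitness(cfg):                        # toy objective over discrete genes
    return -abs(cfg["units"] - 96) - 50 * abs(cfg["lr"] - 0.01)

def random_cfg():
    return {"units": random.choice(range(16, 257, 16)),
            "lr": random.choice([0.001, 0.003, 0.01, 0.03, 0.1])}

def crossover(a, b):                     # uniform crossover, gene by gene
    return {k: random.choice([a[k], b[k]]) for k in a}

def mutate(cfg, rate=0.2):               # re-sample a gene with prob `rate`
    fresh = random_cfg()
    return {k: (fresh[k] if random.random() < rate else v)
            for k, v in cfg.items()}

pop = [random_cfg() for _ in range(20)]
for gen in range(30):
    pop.sort(key=fitness, reverse=True)
    parents = pop[:10]                   # truncation selection
    pop = parents + [mutate(crossover(random.choice(parents),
                                      random.choice(parents)))
                     for _ in range(10)]

print(max(pop, key=fitness))
```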
Particle Swarm Optimization
Metaheuristic technique that simulates the social behavior of a swarm to explore the search space. Each particle adjusts its trajectory based on its own best-known position and the best position found within its neighborhood.
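A minimal sketch of the canonical velocity update, assuming a toy objective and the whole swarm as each particle's neighborhood:

```python
import numpy as np

rng = np.random.default_rng(0)

def loss(x):                         # toy objective to minimize
    return np.sum(x ** 2, axis=-1)

n, dim = 15, 2
pos = rng.uniform(-5, 5, (n, dim))
vel = np.zeros((n, dim))
pbest = pos.copy()                   # each particle's personal best
pbest_val = loss(pos)
gbest = pbest[np.argmin(pbest_val)]  # neighborhood best (here: full swarm)

w, c1, c2 = 0.7, 1.5, 1.5            # inertia, cognitive, social weights
for _ in range(50):
    r1, r2 = rng.random((n, dim)), rng.random((n, dim))
    vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
    pos = pos + vel
    val = loss(pos)
    improved = val < pbest_val
    pbest[improved] = pos[improved]
    pbest_val[improved] = val[improved]
    gbest = pbest[np.argmin(pbest_val)]

print(gbest, loss(gbest))
```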
Conditional Hyperparameters
Hyperparameters whose existence or value range depends on the values of other hyperparameters, creating a dependency structure in the search space. Their management requires optimization strategies adapted to hierarchical spaces.
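A minimal sketch of sampling from such a space, assuming a hypothetical choice between SGD and Adam where momentum exists only under SGD:

```python
import random

random.seed(0)

def sample_config():
    cfg = {"optimizer": random.choice(["sgd", "adam"])}
    if cfg["optimizer"] == "sgd":
        # `momentum` only exists when the optimizer is SGD.
        cfg["momentum"] = random.uniform(0.0, 0.99)
    else:
        # Adam instead exposes its own conditional hyperparameter.
        cfg["beta1"] = random.uniform(0.85, 0.999)
    cfg["lr"] = 10 ** random.uniform(-5, -1)  # unconditional
    return cfg

for _ in range(3):
    print(sample_config())
```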
Multi-objective Optimization
Extension of hyperparameter optimization that simultaneously handles multiple, often conflicting, objectives like accuracy and latency. It produces a Pareto front of optimal solutions representing different possible trade-offs.
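A minimal sketch of extracting the Pareto front from already-evaluated configurations; the accuracy/latency numbers are invented for illustration:

```python
# Hypothetical evaluated configurations: (accuracy, latency in ms).
candidates = [
    {"cfg": "A", "acc": 0.91, "latency": 120.0},
    {"cfg": "B", "acc": 0.89, "latency": 40.0},
    {"cfg": "C", "acc": 0.88, "latency": 90.0},   # dominated by B
    {"cfg": "D", "acc": 0.93, "latency": 300.0},
]

def dominates(a, b):
    # a dominates b: no worse on both objectives, strictly better on one.
    return (a["acc"] >= b["acc"] and a["latency"] <= b["latency"]
            and (a["acc"] > b["acc"] or a["latency"] < b["latency"]))

pareto = [c for c in candidates
          if not any(dominates(o, c) for o in candidates if o is not c)]
print(pareto)  # the non-dominated trade-offs: A, B, D
```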
Transfer Learning for Hyperparameters
Technique that reuses knowledge about hyperparameter performance acquired from previous tasks or datasets. This approach significantly speeds up optimization on new but similar tasks.
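A minimal warm-starting sketch, assuming hypothetical best configurations carried over from earlier tasks and locally perturbed on the new one:

```python
import math
import random

random.seed(0)

# Hypothetical best configurations found on previous, related tasks.
prior_best = [{"lr": 0.008, "dropout": 0.25},
              {"lr": 0.012, "dropout": 0.35}]

def score(lr, dropout):  # stand-in for training on the *new* task
    return -((math.log10(lr) + 2) ** 2) - (dropout - 0.3) ** 2

def perturb(cfg):
    # Sample near a transferred configuration instead of from scratch.
    return {"lr": cfg["lr"] * 10 ** random.uniform(-0.3, 0.3),
            "dropout": min(0.9, max(0.0,
                cfg["dropout"] + random.uniform(-0.05, 0.05)))}

trials = prior_best + [perturb(random.choice(prior_best)) for _ in range(10)]
best = max(trials, key=lambda c: score(**c))
print(best, score(**best))
```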
Neuroevolution
Application of evolutionary algorithms to the optimization of neural network architectures and weights. It combines the flexibility of evolution with the power of deep networks to discover unconventional solutions.
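A minimal sketch that evolves the weights of a tiny network with a (1+λ) evolution strategy on the XOR task, with no gradients involved; the mutation scale and budget are assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0.0, 1.0, 1.0, 0.0])            # XOR targets

def forward(params, x):
    W1, b1, W2, b2 = params
    h = np.tanh(x @ W1 + b1)
    return 1 / (1 + np.exp(-(h @ W2 + b2)))   # sigmoid output

def loss(params):
    return np.mean((forward(params, X) - y) ** 2)

# (1 + lambda) evolution strategy on the raw weights.
parent = [rng.normal(0, 1, (2, 4)), np.zeros(4),
          rng.normal(0, 1, 4), 0.0]
for gen in range(500):
    children = [[p + rng.normal(0, 0.1, np.shape(p)) for p in parent]
                for _ in range(10)]
    best = min(children, key=loss)
    if loss(best) < loss(parent):             # elitist selection
        parent = best

print(np.round(forward(parent, X), 2), loss(parent))
```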
Gradient-Based Optimization
Approach that treats hyperparameters as optimizable parameters and computes the gradient of the model's loss with respect to them. It enables efficient directional updates but requires the objective to be differentiable in the hyperparameters.
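One concrete instance is hypergradient descent, which updates the learning rate itself using the gradient of the loss with respect to it. A minimal sketch on a quadratic loss (the step sizes are assumptions):

```python
import numpy as np

A = np.diag([1.0, 10.0])              # quadratic loss: 0.5 * w^T A w
def grad(w):
    return A @ w

w = np.array([5.0, 5.0])
lr, beta = 0.01, 1e-6                 # learning rate and hyper-step size
g_prev = np.zeros(2)
for t in range(500):
    g = grad(w)
    # dLoss/dlr = -g_t . g_{t-1}, so descent on lr adds beta * g . g_prev.
    lr += beta * (g @ g_prev)
    w = w - lr * g
    g_prev = g

print(w, lr)                          # w near 0, lr adapted upward
```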
Hierarchical Search Space
Search space structure where hyperparameters are organized in dependency levels, reflecting conditional relationships between parameters. Optimization must respect these structural constraints to generate valid configurations.
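A minimal sketch of a sampler for a two-level space, where choosing a model branch opens only that branch's hyperparameters; the space specification is hypothetical:

```python
import random

random.seed(0)

# Nested spec: choosing a model type opens a subtree of hyperparameters.
SPACE = {
    "model": {
        "mlp": {"layers": [1, 2, 3], "units": [64, 128, 256]},
        "tree": {"max_depth": [3, 5, 7], "min_split": [2, 10]},
    }
}

def sample(space):
    cfg = {}
    for name, spec in space.items():
        if isinstance(spec, dict):     # a choice node with subtrees
            choice = random.choice(list(spec))
            cfg[name] = choice
            # Descend only into the subtree of the selected branch.
            cfg.update({f"{choice}.{k}": random.choice(v)
                        for k, v in spec[choice].items()})
        else:                          # a flat leaf: plain list of values
            cfg[name] = random.choice(spec)
    return cfg

print(sample(SPACE))  # only hyperparameters of the chosen branch appear
```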
Robust Optimization
Paradigm that seeks hyperparameters offering good performance not only on the training dataset but also under perturbations such as data resampling and noise. It prioritizes stability and generalization over squeezing out peak performance on a single evaluation.
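A minimal sketch that scores each configuration over several noisy replications and selects by mean minus standard deviation, one simple robustness criterion; the noise model is an assumption:

```python
import random
import statistics

def noisy_score(cfg, seed):
    # Stand-in for training/evaluating under data resampling or noise;
    # here, larger learning rates are (artificially) noisier.
    random.seed(seed)
    return -(cfg["lr"] - 0.01) ** 2 * 1e4 + random.gauss(0, cfg["lr"] * 50)

candidates = [{"lr": lr} for lr in (0.001, 0.01, 0.05, 0.1)]

def robust_value(cfg, n_reps=10):
    scores = [noisy_score(cfg, seed) for seed in range(n_reps)]
    # Penalize variance: prefer stable configurations over lucky peaks.
    return statistics.mean(scores) - statistics.stdev(scores)

best = max(candidates, key=robust_value)
print(best, robust_value(best))
```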