AI Glossary
A Complete Dictionary of Artificial Intelligence
Bayesian Optimization
Probabilistic method using Gaussian processes to model the objective function and efficiently guide the hyperparameter search.
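The idea can be sketched in miniature: fit a Gaussian process to the evaluations seen so far, then pick the next point by maximizing expected improvement. This is a minimal, unoptimized pure-Python sketch on a hypothetical 1-D "validation loss" over a learning-rate axis; the toy objective, kernel length scale, and grid are assumptions, not part of any real library.

```python
import math

def rbf(a, b, length=0.3):
    # Squared-exponential kernel on scalars (length scale is an assumption).
    return math.exp(-((a - b) ** 2) / (2 * length ** 2))

def solve(A, b):
    # Gaussian elimination with partial pivoting (small systems only).
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def gp_posterior(xs, ys, x, noise=1e-6):
    # Posterior mean and variance of a zero-mean GP at point x.
    K = [[rbf(a, b) + (noise if i == j else 0.0) for j, b in enumerate(xs)]
         for i, a in enumerate(xs)]
    alpha = solve(K, ys)
    k = [rbf(a, x) for a in xs]
    mean = sum(ki * ai for ki, ai in zip(k, alpha))
    v = solve(K, k)
    var = max(rbf(x, x) - sum(ki * vi for ki, vi in zip(k, v)), 1e-12)
    return mean, var

def expected_improvement(mean, var, best):
    # Closed-form EI for minimisation under a normal posterior.
    s = math.sqrt(var)
    z = (best - mean) / s
    cdf = 0.5 * (1 + math.erf(z / math.sqrt(2)))
    pdf = math.exp(-z * z / 2) / math.sqrt(2 * math.pi)
    return (best - mean) * cdf + s * pdf

def objective(lr):
    # Hypothetical validation loss, minimised at lr = 0.4.
    return (lr - 0.4) ** 2

xs = [0.05, 0.5, 0.95]               # initial design points
ys = [objective(x) for x in xs]
for _ in range(10):
    grid = [i / 100 for i in range(1, 100)]
    best = min(ys)
    nxt = max(grid, key=lambda x: expected_improvement(*gp_posterior(xs, ys, x), best))
    xs.append(nxt)
    ys.append(objective(nxt))
best_lr = xs[ys.index(min(ys))]
```

Production implementations (e.g. in Optuna or scikit-optimize) cache the kernel matrix factorisation and tune the kernel hyperparameters; this sketch resolves the linear system per query for brevity.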
Hyperband and Successive Halving
Dynamic resource-allocation approaches that eliminate poor configurations early so the computational budget is concentrated on promising ones.
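Successive halving, the building block of Hyperband, is short enough to show in full: evaluate every candidate on a small budget, keep the best fraction, and repeat with a larger budget. The toy `train_step` (loss improves with budget) and the specific configurations are illustrative assumptions.

```python
import random

def successive_halving(configs, train_step, rungs=3, eta=2):
    """Evaluate all survivors, keep the top 1/eta, multiply the budget by eta."""
    survivors = list(configs)
    budget = 1
    for _ in range(rungs):
        scored = sorted((train_step(c, budget), c["id"], c) for c in survivors)
        keep = max(1, len(survivors) // eta)
        survivors = [c for _, _, c in scored[:keep]]
        budget *= eta
    return survivors[0]

# Hypothetical setting: loss = distance from a "good" lr, plus a budget-dependent term.
random.seed(1)
configs = [{"id": i, "lr": random.uniform(0.01, 1.0)} for i in range(8)]

def train_step(cfg, budget):
    return abs(cfg["lr"] - 0.1) + 1.0 / budget

best = successive_halving(configs, train_step)
```

With 8 configurations, eta=2, and 3 rungs, the survivor counts are 8 → 4 → 2 → 1; Hyperband runs several such brackets with different starting budgets to hedge against evaluations that are misleading at low budget.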
Multi-Objective Optimization
Advanced techniques for simultaneously optimizing conflicting performance metrics (e.g., accuracy vs. inference time), typically returning a set of trade-off solutions rather than a single winner.
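The core object in multi-objective optimization is the Pareto front: the configurations not dominated by any other. A minimal sketch, with hypothetical (error rate, latency) pairs as the two objectives to minimise:

```python
def dominates(a, b):
    """a dominates b if it is no worse on every objective and strictly better on one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(points):
    # Keep every point that no other point dominates.
    return [p for p in points if not any(dominates(q, p) for q in points)]

# Each candidate: (validation error, inference latency in ms) — both minimised.
candidates = [(0.10, 50), (0.12, 20), (0.09, 80), (0.15, 15), (0.11, 60)]
front = pareto_front(candidates)
# (0.11, 60) is excluded: (0.10, 50) beats it on both objectives.
```

Real multi-objective optimizers (NSGA-II, Optuna's multi-objective samplers) add mechanisms to spread samples along the front, but the dominance test above is the shared foundation.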
BOHB (Bayesian Optimization HyperBand)
A hybrid of Bayesian optimization and Hyperband: a probabilistic model proposes configurations while Hyperband-style halving allocates budget adaptively.
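The interplay can be sketched loosely: instead of sampling new configurations uniformly (as plain Hyperband does), sample most of them near the best results observed so far, then run successive halving on each batch. This is a simplified stand-in for BOHB's actual TPE-style density model; the perturbation scale, bracket counts, and toy objective are assumptions.

```python
import random

def model_sample(history, low, high, frac_model=0.7):
    """Usually perturb one of the best configs seen so far; sometimes explore uniformly."""
    if history and random.random() < frac_model:
        good = sorted(history, key=lambda h: h[1])[: max(1, len(history) // 3)]
        base, _ = random.choice(good)
        return min(high, max(low, base + random.gauss(0, (high - low) * 0.1)))
    return random.uniform(low, high)

def bohb_sketch(objective, low=0.0, high=1.0, brackets=4, n=8, rungs=3, eta=2):
    history = []  # (config, loss) pairs observed at any budget
    for _ in range(brackets):
        survivors = [model_sample(history, low, high) for _ in range(n)]
        budget = 1
        for _ in range(rungs):
            scored = sorted((objective(c, budget), c) for c in survivors)
            history.extend((c, loss) for loss, c in scored)
            survivors = [c for _, c in scored[: max(1, len(survivors) // eta)]]
            budget *= eta
    return min(history, key=lambda h: h[1])[0]

random.seed(2)
# Hypothetical loss: quadratic in the config, improving with budget.
best = bohb_sketch(lambda x, b: (x - 0.6) ** 2 + 0.5 / b)
```

The real BOHB fits kernel density estimators over good and bad configurations per budget level; the structure above only illustrates how model-based sampling and halving interleave.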
Neural Architecture Search (NAS)
Automatic optimization of neural network structure (depth, width, connections) in addition to classical hyperparameters.
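At its simplest, NAS is a search over a discrete space of structural choices. The sketch below does random search over a tiny hypothetical space (depth, width, skip connections) scored by a made-up proxy; real NAS replaces `proxy_score` with actual training or a learned performance predictor.

```python
import random

# Hypothetical architecture search space.
SEARCH_SPACE = {
    "depth": [2, 4, 8],
    "width": [32, 64, 128],
    "skip_connections": [False, True],
}

def sample_architecture():
    return {k: random.choice(v) for k, v in SEARCH_SPACE.items()}

def proxy_score(arch):
    """Stand-in for trained accuracy: reward capacity, penalise parameter count."""
    capacity = arch["depth"] * arch["width"] + (50 if arch["skip_connections"] else 0)
    params = arch["depth"] * arch["width"] ** 2
    return capacity - 0.001 * params

def random_nas(n_trials=50):
    archs = [sample_architecture() for _ in range(n_trials)]
    return max(archs, key=proxy_score)

random.seed(3)
best_arch = random_nas()
```

More sophisticated NAS methods (reinforcement learning controllers, differentiable relaxations like DARTS, evolutionary search) differ mainly in how the next architecture is proposed; the space definition and scoring loop keep this shape.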
Meta-Learning for Hyperparameters
Approach that learns from previous optimizations to intelligently initialize the search on new tasks.
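A minimal form of this idea is warm-starting: seed the search on a new task with the best hyperparameters from the most similar past tasks. The prior-task records, the task-size similarity measure, and the toy objective below are all hypothetical.

```python
import random

# Hypothetical records of the best hyperparameters found on earlier tasks.
PRIOR_TASKS = [
    {"task_size": 1_000, "best_lr": 0.05},
    {"task_size": 10_000, "best_lr": 0.01},
    {"task_size": 100_000, "best_lr": 0.005},
]

def warm_start_candidates(new_task_size, k=2):
    """Take the best configs from the k past tasks most similar to the new one."""
    ranked = sorted(PRIOR_TASKS, key=lambda t: abs(t["task_size"] - new_task_size))
    return [t["best_lr"] for t in ranked[:k]]

def meta_search(objective, new_task_size, n_random=5):
    # Evaluate warm-start candidates alongside a few random ones.
    random_candidates = [random.uniform(1e-4, 0.1) for _ in range(n_random)]
    candidates = warm_start_candidates(new_task_size) + random_candidates
    return min(candidates, key=objective)

random.seed(4)
best_lr = meta_search(lambda lr: (lr - 0.01) ** 2, new_task_size=12_000)
```

Full meta-learning systems learn richer task descriptors and transfer whole surrogate models, but the payoff is the same: the first evaluations on a new task are already near-optimal instead of random.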
Evolutionary Algorithm Optimization
Methods inspired by natural evolution using mutation, crossover, and selection to explore the hyperparameter space.
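All three operators fit in a short loop. This is a minimal elitist genetic algorithm over a single continuous hyperparameter; population size, mutation scale, and the toy fitness function are illustrative choices.

```python
import random

def evolve(fitness, bounds, pop_size=20, generations=30, mut_sigma=0.1):
    """Minimal GA: tournament selection, blend crossover, Gaussian mutation, elitism."""
    low, high = bounds
    pop = [random.uniform(low, high) for _ in range(pop_size)]
    for _ in range(generations):
        def tournament():
            a, b = random.sample(pop, 2)
            return a if fitness(a) < fitness(b) else b  # minimisation
        children = []
        for _ in range(pop_size):
            p1, p2 = tournament(), tournament()   # selection
            w = random.random()
            child = w * p1 + (1 - w) * p2         # crossover (blend)
            child += random.gauss(0, mut_sigma)   # mutation
            children.append(min(high, max(low, child)))
        # Elitist replacement: keep the best of parents and children.
        pop = sorted(pop + children, key=fitness)[:pop_size]
    return pop[0]

random.seed(5)
# Hypothetical fitness: validation loss minimised at 0.3.
best = evolve(lambda x: (x - 0.3) ** 2, bounds=(0.0, 1.0))
```

For multiple hyperparameters the individual becomes a vector or dict, with crossover and mutation applied per coordinate; CMA-ES and similar methods additionally adapt the mutation distribution over time.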
Population-Based Training (PBT)
Competitive evolutionary technique that exploits and explores simultaneously by adapting hyperparameters during training.
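The exploit/explore cycle is concise in code: periodically, each poorly performing worker copies a top worker's hyperparameters (exploit) and perturbs them (explore), while training continues. The worker count, perturbation factors, and toy reward below are assumptions standing in for real partial training.

```python
import random

def pbt(step, n_workers=8, rounds=10):
    """Each worker trains with its own hyperparameter; losers copy winners and perturb."""
    population = [{"lr": random.uniform(0.001, 1.0), "score": 0.0}
                  for _ in range(n_workers)]
    for _ in range(rounds):
        for w in population:
            w["score"] = step(w["lr"])            # stands in for partial training
        population.sort(key=lambda w: w["score"], reverse=True)  # higher is better
        half = n_workers // 2
        for loser, winner in zip(population[half:], population[:half]):
            # Exploit: copy a winner. Explore: perturb multiplicatively.
            loser["lr"] = winner["lr"] * random.choice([0.8, 1.25])
    return max(population, key=lambda w: w["score"])["lr"]

random.seed(6)
# Hypothetical reward peaked at lr = 0.1.
best_lr = pbt(lambda lr: -abs(lr - 0.1))
```

The distinctive property, unlike the other methods in this glossary, is that hyperparameters change *during* a single training run, so PBT can discover schedules (e.g. a decaying learning rate) rather than one fixed value.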
Constrained Optimization
Advanced methods handling restrictions on hyperparameters (bounds, dependencies, resource constraints).
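The simplest constraint-handling strategy is rejection sampling: draw inside the box bounds and discard draws that violate dependency constraints. The bounds and the two dependency rules below are hypothetical examples, not from any particular system.

```python
import random

# Hypothetical box bounds per hyperparameter.
BOUNDS = {"lr": (1e-4, 1.0), "batch_size": (8, 512), "warmup_steps": (0, 10_000)}

def satisfies(cfg):
    """Hypothetical dependency constraints: large batches need a non-trivial
    warm-up, and the memory footprint must stay under a fixed budget."""
    if cfg["batch_size"] >= 256 and cfg["warmup_steps"] < 500:
        return False
    return cfg["batch_size"] * cfg.get("model_copies", 1) <= 1024

def sample_feasible(max_tries=1000):
    # Rejection sampling: respect bounds by construction, reject dependency violations.
    for _ in range(max_tries):
        cfg = {
            "lr": random.uniform(*BOUNDS["lr"]),
            "batch_size": random.randrange(*BOUNDS["batch_size"]),
            "warmup_steps": random.randrange(*BOUNDS["warmup_steps"]),
        }
        if satisfies(cfg):
            return cfg
    raise RuntimeError("no feasible configuration found")

random.seed(7)
samples = [sample_feasible() for _ in range(20)]
```

Rejection works when the feasible region is large; when constraints are expensive to check or rarely satisfied, methods that model the constraint surface (constrained Bayesian optimization) or repair infeasible points are preferred.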
Parallel and Distributed Optimization
Parallelization strategies that accelerate optimization by distributing independent evaluations across multiple computational resources.
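Because trial evaluations are independent, the basic pattern is embarrassingly parallel. A minimal sketch using Python's standard-library executor, with a toy `evaluate` standing in for an expensive training run:

```python
import concurrent.futures
import random

def evaluate(cfg):
    """Stand-in for one training run; in practice this is the expensive step."""
    lr, momentum = cfg
    return (lr - 0.1) ** 2 + (momentum - 0.9) ** 2

random.seed(8)
configs = [(random.uniform(0, 1), random.uniform(0, 1)) for _ in range(32)]

# Evaluate configurations concurrently; map preserves input order.
with concurrent.futures.ThreadPoolExecutor(max_workers=4) as pool:
    losses = list(pool.map(evaluate, configs))

best_cfg = configs[losses.index(min(losses))]
```

For CPU-bound evaluations, `ProcessPoolExecutor` avoids the GIL; cluster-scale setups (Ray Tune, Dask) follow the same map-then-reduce shape. Sequential model-based methods need extra care here, since the model must propose new points before all pending results arrive (asynchronous or batched suggestion strategies).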
Gradient-based Hyperparameter Optimization
Differentiable approach that optimizes certain hyperparameters through gradient descent rather than discrete search.
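When the inner training problem has a differentiable (here, closed-form) solution, the validation loss can be differentiated with respect to a continuous hyperparameter via the chain rule. A self-contained sketch tuning the ridge penalty λ of a 1-D regression; the data and step size are illustrative.

```python
def fit_ridge_1d(xs, ys, lam):
    """Closed-form 1-D ridge regression: w = Σxy / (Σx² + λ)."""
    sxy = sum(x * y for x, y in zip(xs, ys))
    sxx = sum(x * x for x in xs)
    return sxy / (sxx + lam)

def val_loss_and_grad(lam, train, val):
    """Differentiate the validation loss through the closed-form fit (chain rule)."""
    xs, ys = train
    sxy = sum(x * y for x, y in zip(xs, ys))
    sxx = sum(x * x for x in xs)
    w = sxy / (sxx + lam)
    dw_dlam = -sxy / (sxx + lam) ** 2          # d(inner solution)/dλ
    vx, vy = val
    loss = sum((w * x - y) ** 2 for x, y in zip(vx, vy))
    dloss_dw = sum(2 * (w * x - y) * x for x, y in zip(vx, vy))
    return loss, dloss_dw * dw_dlam            # dL/dλ = dL/dw · dw/dλ

# Hypothetical noisy training data and clean validation data.
train = ([1.0, 2.0, 3.0], [2.2, 3.6, 6.4])
val = ([1.5, 2.5], [3.0, 5.0])
lam, lr = 1.0, 0.05
for _ in range(200):
    loss, grad = val_loss_and_grad(lam, train, val)
    lam = max(0.0, lam - lr * grad)            # gradient descent on λ itself
```

When the inner problem is itself trained by gradient descent, frameworks differentiate through (or approximate) the training trajectory, e.g. with implicit differentiation; the principle is the same chain rule shown above.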
Auto-Sklearn and AutoML Frameworks
Complete systems automating hyperparameter optimization, model selection, and data preprocessing.
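The problem these frameworks solve (sometimes called CASH: combined algorithm selection and hyperparameter optimization) is a joint search over preprocessing, model family, and that model's hyperparameters. A toy random-search sketch over a hypothetical pipeline space, with a made-up scoring function standing in for cross-validation:

```python
import random

# Hypothetical pipeline space: preprocessing × model × one hyperparameter per model.
PREPROCESSORS = ["none", "standardize"]
MODELS = {"knn": {"k": [1, 3, 5]}, "ridge": {"alpha": [0.1, 1.0, 10.0]}}

def sample_pipeline():
    model = random.choice(list(MODELS))
    hp_name, values = next(iter(MODELS[model].items()))
    return {"prep": random.choice(PREPROCESSORS),
            "model": model,
            hp_name: random.choice(values)}

def evaluate(pipeline):
    """Stand-in for a cross-validated score (entirely made up for illustration)."""
    score = 0.7
    if pipeline["prep"] == "standardize":
        score += 0.1
    if pipeline["model"] == "ridge" and pipeline.get("alpha") == 1.0:
        score += 0.15
    if pipeline["model"] == "knn" and pipeline.get("k") == 3:
        score += 0.12
    return score

def automl_search(n_trials=40):
    trials = [sample_pipeline() for _ in range(n_trials)]
    return max(trials, key=evaluate)

random.seed(9)
best_pipeline = automl_search()
```

Real frameworks such as auto-sklearn replace random sampling with Bayesian optimization over this conditional space, add meta-learned warm starts, and ensemble the best pipelines found.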