AI Glossary
The complete dictionary of Artificial Intelligence
Hyperparameter
Parameter external to the model whose value must be set before training; it directly influences the learning capacity and final performance of the algorithm.
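A minimal sketch of the distinction, assuming a toy one-weight linear model trained by gradient descent: the learning rate and epoch count are hyperparameters fixed before training, while the weight w is a parameter learned from the data.

```python
# Hyperparameters: chosen by the practitioner BEFORE training starts.
learning_rate = 0.1
epochs = 100

data = [(1.0, 2.0), (2.0, 4.0), (3.0, 6.0)]  # toy dataset: y = 2x

# Model parameter: learned DURING training, from the data.
w = 0.0
for _ in range(epochs):
    # gradient of mean squared error with respect to w
    grad = sum(2 * (w * x - y) * x for x, y in data) / len(data)
    w -= learning_rate * grad

print(round(w, 3))  # prints 2.0
```

Changing the hyperparameters (say, a much larger learning rate) changes how, and whether, the parameter w converges.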
Underfitting
Situation where the model is too simple to capture the underlying structure of the data, resulting in poor performance on both training and test sets.
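A small illustration of the symptom, assuming a hypothetical "too simple" model that always predicts the training mean on data that is actually linear: the error is high on both the training and the test set.

```python
# Data with a clear linear structure: y = 2x
train = [(x, 2 * x) for x in range(10)]
test = [(x, 2 * x) for x in range(10, 15)]

# Underfitting model: a constant prediction (the training-target mean)
mean_y = sum(y for _, y in train) / len(train)

def mse(data, predict):
    # mean squared error of a prediction function on a dataset
    return sum((predict(x) - y) ** 2 for x, y in data) / len(data)

train_err = mse(train, lambda x: mean_y)
test_err = mse(test, lambda x: mean_y)
# Both errors stay large: the model cannot capture the slope.
```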
Grid Search
Systematic optimization method that exhaustively explores all possible combinations of hyperparameters in a predefined grid to identify the best configuration.
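A sketch of the exhaustive enumeration, assuming a hypothetical `validation_score` function standing in for real model training and evaluation (its peak at learning_rate=0.1, depth=3 is an invented toy objective):

```python
import itertools

def validation_score(learning_rate, depth):
    # hypothetical stand-in for "train a model, score it on validation"
    return -((learning_rate - 0.1) ** 2) - (depth - 3) ** 2

grid = {
    "learning_rate": [0.01, 0.1, 1.0],
    "depth": [1, 3, 5],
}

best_score, best_params = float("-inf"), None
# every combination in the grid is evaluated, with no exceptions
for lr, d in itertools.product(grid["learning_rate"], grid["depth"]):
    score = validation_score(lr, d)
    if score > best_score:
        best_score, best_params = score, {"learning_rate": lr, "depth": d}

print(best_params)  # prints {'learning_rate': 0.1, 'depth': 3}
```

The cost grows multiplicatively with each added hyperparameter (here 3 × 3 = 9 evaluations), which is why exhaustive grids become impractical in high dimensions.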
Random Search
Optimization approach that randomly samples hyperparameter combinations in the search space, often more efficient than Grid Search for high-dimensional spaces.
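The same toy objective as an exhaustive grid would use, but explored by random sampling; the log-uniform draw for the learning rate and the budget of 100 trials are illustrative choices, not prescriptions:

```python
import random

random.seed(0)  # fixed seed so the sketch is reproducible

def validation_score(learning_rate, depth):
    # hypothetical stand-in for "train a model, score it on validation"
    return -((learning_rate - 0.1) ** 2) - (depth - 3) ** 2

best_score, best_params = float("-inf"), None
for _ in range(100):  # a fixed budget of random draws, not a full grid
    lr = 10 ** random.uniform(-3, 0)  # log-uniform sample in [0.001, 1]
    d = random.randint(1, 8)
    score = validation_score(lr, d)
    if score > best_score:
        best_score, best_params = score, {"learning_rate": lr, "depth": d}
```

Unlike a grid, the budget is decoupled from the dimensionality: adding another hyperparameter does not multiply the number of evaluations.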
Bayesian Optimization
Sequential optimization method using a probabilistic model to guide the search for optimal hyperparameters by balancing exploration and exploitation.
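A compact sketch of the idea in pure Python, under simplifying assumptions: the probabilistic surrogate is a Gaussian process with an RBF kernel, the acquisition function is an upper confidence bound (mean plus an exploration bonus), and the objective is a made-up one-dimensional score peaked at x = 0.7. Real implementations use more robust numerics and acquisition functions.

```python
import math

def rbf(a, b, length_scale=0.3):
    # RBF kernel: correlation decays with distance between points
    return math.exp(-((a - b) ** 2) / (2 * length_scale ** 2))

def solve(A, b):
    # naive Gauss-Jordan elimination with partial pivoting
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(M[r][c]))
        M[c], M[p] = M[p], M[c]
        piv = M[c][c]
        for r in range(n):
            if r != c:
                f = M[r][c] / piv
                for k in range(c, n + 1):
                    M[r][k] -= f * M[c][k]
    return [M[i][n] / M[i][i] for i in range(n)]

def gp_posterior(xs, ys, x):
    # GP posterior mean and variance at x given observations (xs, ys)
    K = [[rbf(a, b) + (1e-6 if i == j else 0.0)
          for j, b in enumerate(xs)] for i, a in enumerate(xs)]
    alpha = solve(K, ys)
    k = [rbf(a, x) for a in xs]
    mu = sum(ki * ai for ki, ai in zip(k, alpha))
    v = solve(K, k)
    var = rbf(x, x) - sum(ki * vi for ki, vi in zip(k, v))
    return mu, max(var, 1e-12)

def objective(x):
    # hypothetical validation score, maximized at x = 0.7
    return -((x - 0.7) ** 2)

xs = [0.1, 0.5, 0.9]                 # initial observations
ys = [objective(x) for x in xs]
candidates = [i / 100 for i in range(101)]

for _ in range(10):
    def ucb(x):
        # exploitation (mu) balanced against exploration (sqrt(var))
        mu, var = gp_posterior(xs, ys, x)
        return mu + 2.0 * math.sqrt(var)
    x_next = max(candidates, key=ucb)  # most promising point next
    xs.append(x_next)
    ys.append(objective(x_next))

best = xs[max(range(len(ys)), key=lambda i: ys[i])]
```

Each iteration spends one (expensive) objective evaluation where the surrogate says it is most worthwhile, which is what makes the method sample-efficient compared with grid or random search.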
Validation Score
Quantitative metric evaluating model performance on the validation set, serving as an objective criterion for hyperparameter selection and tuning.
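A sketch of the selection role the definition describes, assuming a hypothetical threshold classifier and a tiny held-out validation set: the candidate with the best validation score wins.

```python
# Held-out validation set: (input, true label) pairs
validation_set = [(0.2, 0), (0.4, 0), (0.6, 1), (0.9, 1)]

def validation_accuracy(threshold):
    # score of a toy classifier that predicts 1 when x >= threshold
    correct = sum((x >= threshold) == bool(y) for x, y in validation_set)
    return correct / len(validation_set)

# The validation score is the objective criterion for the choice:
scores = {t: validation_accuracy(t) for t in (0.3, 0.5, 0.8)}
best_threshold = max(scores, key=scores.get)
```

The same pattern generalizes: replace the toy accuracy with any evaluation metric and the threshold with any hyperparameter.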
Ensemble Learning
Paradigm combining multiple base models to improve overall predictive performance through aggregation of their individual predictions.
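A minimal illustration of aggregation by majority vote, assuming three hypothetical base classifiers with fixed (partly wrong) predictions: each model errs on a different sample, so the vote corrects all three.

```python
labels = [1, 0, 1, 1, 0, 1]   # ground truth
model_a = [1, 0, 1, 0, 0, 1]  # wrong on sample 3
model_b = [1, 0, 0, 1, 0, 1]  # wrong on sample 2
model_c = [1, 1, 1, 1, 0, 1]  # wrong on sample 1

def majority_vote(predictions):
    # aggregate per-sample: predict 1 when at least 2 of 3 models agree
    return [1 if sum(p) >= 2 else 0 for p in zip(*predictions)]

ensemble = majority_vote([model_a, model_b, model_c])

def accuracy(pred):
    return sum(p == y for p, y in zip(pred, labels)) / len(labels)
```

The gain relies on the base models making different mistakes; three copies of the same model would vote away nothing.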
Boosting
Sequential ensemble method where each model learns from the errors of the previous one, creating a powerful composite through adaptive weighting of weak learners.
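A sketch of the sequential error-correcting idea in the style of gradient boosting, under toy assumptions: the weak learners are one-split regression stumps, each fitted to the residuals (errors) left by the models before it, and combined with a shrinkage factor.

```python
def fit_stump(xs, ys):
    # weak learner: best single threshold split minimizing squared error
    best = None
    for t in sorted(set(xs)):
        left = [y for x, y in zip(xs, ys) if x <= t]
        right = [y for x, y in zip(xs, ys) if x > t]
        if not left or not right:
            continue
        ml, mr = sum(left) / len(left), sum(right) / len(right)
        err = (sum((y - ml) ** 2 for y in left)
               + sum((y - mr) ** 2 for y in right))
        if best is None or err < best[0]:
            best = (err, t, ml, mr)
    _, t, ml, mr = best
    return lambda x: ml if x <= t else mr

xs = [0, 1, 2, 3, 4, 5]
ys = [0, 0, 1, 1, 3, 3]   # toy regression targets
lr = 0.5                  # shrinkage applied to each weak learner

models, preds = [], [0.0] * len(xs)
for _ in range(20):
    residuals = [y - p for y, p in zip(ys, preds)]  # current errors
    h = fit_stump(xs, residuals)  # each stump learns from the errors
    models.append(h)
    preds = [p + lr * h(x) for p, x in zip(preds, xs)]

def predict(x):
    # the composite: shrunken sum of all weak learners
    return sum(lr * h(x) for h in models)
```

Note this illustrates the gradient-boosting flavor of the idea; AdaBoost instead reweights the training samples adaptively rather than fitting residuals directly.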
Evaluation Metric
Quantitative indicator measuring model performance according to specific criteria (accuracy, recall, F1-score, AUC-ROC, MSE, MAE, etc.).
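A worked sketch of several of the listed classification metrics, computed from the confusion-matrix counts of a small invented prediction set:

```python
y_true = [1, 0, 1, 1, 0, 1, 0, 0]
y_pred = [1, 0, 1, 0, 0, 1, 1, 0]

# Confusion-matrix counts
tp = sum(t == 1 and p == 1 for t, p in zip(y_true, y_pred))
tn = sum(t == 0 and p == 0 for t, p in zip(y_true, y_pred))
fp = sum(t == 0 and p == 1 for t, p in zip(y_true, y_pred))
fn = sum(t == 1 and p == 0 for t, p in zip(y_true, y_pred))

accuracy = (tp + tn) / len(y_true)          # fraction correct overall
precision = tp / (tp + fp)                  # correctness of positive calls
recall = tp / (tp + fn)                     # coverage of true positives
f1 = 2 * precision * recall / (precision + recall)  # harmonic mean
```

Regression metrics such as MSE and MAE follow the same pattern (averages of squared or absolute prediction errors), while AUC-ROC additionally requires prediction scores rather than hard labels.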
Elbow Point
Point of diminishing returns on a performance curve, beyond which the marginal improvement becomes negligible, often indicating the optimal value of a hyperparameter (e.g. the number of clusters).
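One common heuristic for locating the elbow, sketched on an invented clustering-inertia curve: take the point on the curve farthest from the straight line joining its endpoints.

```python
import math

ks = [1, 2, 3, 4, 5, 6, 7, 8]                    # hyperparameter values
scores = [100, 55, 30, 22, 19, 17, 16, 15]       # e.g. clustering inertia

(x1, y1), (x2, y2) = (ks[0], scores[0]), (ks[-1], scores[-1])
norm = math.hypot(x2 - x1, y2 - y1)

def dist_to_chord(x, y):
    # perpendicular distance from (x, y) to the endpoint-to-endpoint line
    return abs((y2 - y1) * x - (x2 - x1) * y + x2 * y1 - y2 * x1) / norm

# The elbow is where the curve bulges farthest from that line
elbow_k = max(zip(ks, scores), key=lambda p: dist_to_chord(*p))[0]
```

Here the improvement from k=3 onward is marginal, and the heuristic picks k=3 accordingly.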