AI Glossary
The complete glossary of Artificial Intelligence
Expected Calibration Error (ECE)
Quantitative metric that evaluates the calibration of a model by calculating the weighted average difference between predicted confidence and observed accuracy over discrete confidence intervals.
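A minimal NumPy sketch of how ECE can be computed over equal-width bins; the function and variable names here are illustrative, not from a specific library:

```python
import numpy as np

def expected_calibration_error(confidences, correct, n_bins=10):
    """Weighted average |accuracy - confidence| gap across confidence bins.

    confidences: array of predicted confidences in [0, 1]
    correct:     array of 0/1 flags, 1 where the prediction was right
    """
    edges = np.linspace(0.0, 1.0, n_bins + 1)
    ece = 0.0
    for lo, hi in zip(edges[:-1], edges[1:]):
        mask = (confidences > lo) & (confidences <= hi)
        if mask.any():
            gap = abs(correct[mask].mean() - confidences[mask].mean())
            ece += mask.mean() * gap  # weight by the fraction of samples in the bin
    return ece
```

Because each bin's gap is weighted by how many predictions fall into it, sparsely populated bins contribute little to the final score.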
Maximum Calibration Error (MCE)
Calibration metric that identifies the maximum deviation between confidence and accuracy across all confidence intervals, useful for detecting worst-case miscalibration.
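A sketch of MCE under the same equal-width binning as ECE, keeping the largest per-bin gap instead of the weighted average (names are illustrative):

```python
import numpy as np

def maximum_calibration_error(confidences, correct, n_bins=10):
    """Largest |accuracy - confidence| gap over all non-empty confidence bins."""
    edges = np.linspace(0.0, 1.0, n_bins + 1)
    mce = 0.0
    for lo, hi in zip(edges[:-1], edges[1:]):
        mask = (confidences > lo) & (confidences <= hi)
        if mask.any():
            gap = abs(correct[mask].mean() - confidences[mask].mean())
            mce = max(mce, gap)  # track the worst-case bin, not the average
    return mce
```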
Adaptive Calibration Error (ACE)
Variant of ECE that uses adaptive confidence intervals with an equal number of samples per bin, reducing variance and the influence of the number of bins on the measurement.
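One way to realize the equal-samples-per-bin idea is to sort predictions by confidence and split them into chunks of (nearly) equal size; this sketch uses that approach, with illustrative names:

```python
import numpy as np

def adaptive_calibration_error(confidences, correct, n_bins=10):
    """Mean |accuracy - confidence| gap over bins holding equal numbers of samples."""
    order = np.argsort(confidences)
    # array_split yields n_bins chunks whose sizes differ by at most one sample
    bins = np.array_split(order, n_bins)
    gaps = [abs(correct[b].mean() - confidences[b].mean())
            for b in bins if b.size > 0]
    return float(np.mean(gaps))
```

Since every bin holds the same number of samples, no bin's gap estimate is based on only a handful of points, which is what reduces the variance relative to equal-width binning.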
Static Calibration Error (SCE)
Calibration metric that evaluates the average calibration across all bins without weighting by the distribution of predictions, giving equal weight to all confidence intervals.
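Following the definition above, SCE can be sketched as an unweighted mean of the per-bin gaps over equal-width bins (illustrative names; note that some papers define SCE with an additional per-class decomposition):

```python
import numpy as np

def static_calibration_error(confidences, correct, n_bins=10):
    """Unweighted mean |accuracy - confidence| gap over non-empty bins."""
    edges = np.linspace(0.0, 1.0, n_bins + 1)
    gaps = []
    for lo, hi in zip(edges[:-1], edges[1:]):
        mask = (confidences > lo) & (confidences <= hi)
        if mask.any():
            gaps.append(abs(correct[mask].mean() - confidences[mask].mean()))
    # every bin counts equally, regardless of how many predictions it holds
    return float(np.mean(gaps)) if gaps else 0.0
```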
Native Calibration
Calibration obtained directly during model training through techniques like label smoothing or specific loss functions, without requiring a separate calibration step.
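Label smoothing, mentioned above, can be sketched as replacing one-hot training targets with softened ones; this example assumes the common convention of redistributing the smoothing mass uniformly over all classes (names are illustrative):

```python
import numpy as np

def smooth_labels(labels, n_classes, eps=0.1):
    """Turn integer class labels into smoothed target distributions.

    The true class gets (1 - eps) plus its uniform share; every class
    receives eps / n_classes, so rows still sum to 1.
    """
    one_hot = np.eye(n_classes)[labels]
    return one_hot * (1.0 - eps) + eps / n_classes
```

Training against these softer targets discourages the model from pushing logits to extremes, which tends to produce less overconfident (better-calibrated) predictions without any post-hoc step.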
Confidence Histogram
Frequency distribution of predictions organized by confidence intervals, used to analyze model behavior and identify regions of miscalibration.
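A confidence histogram is a direct application of standard histogram binning over [0, 1]; a minimal sketch with illustrative names:

```python
import numpy as np

def confidence_histogram(confidences, n_bins=10):
    """Count how many predictions fall into each confidence interval."""
    counts, edges = np.histogram(confidences, bins=n_bins, range=(0.0, 1.0))
    return counts, edges
```

Plotting these counts next to per-bin accuracy (a reliability diagram) makes regions of over- or under-confidence visible at a glance.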
Binning Strategy
Method of partitioning the confidence space into discrete intervals for calculating calibration metrics, influencing the precision and stability of error estimates.
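The two binning strategies used by the metrics above can be contrasted in a few lines: equal-width bins partition the confidence axis evenly, while equal-mass bins place edges at quantiles so each bin holds the same number of samples (names are illustrative):

```python
import numpy as np

def equal_width_edges(n_bins=10):
    """Bin edges that split [0, 1] into intervals of equal width."""
    return np.linspace(0.0, 1.0, n_bins + 1)

def equal_mass_edges(confidences, n_bins=10):
    """Bin edges at confidence quantiles, so bins hold equal sample counts."""
    return np.quantile(confidences, np.linspace(0.0, 1.0, n_bins + 1))
```

The choice matters: with few samples at low confidence, equal-width bins yield noisy gap estimates there, whereas equal-mass bins trade fixed interval widths for stable per-bin statistics.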