AI Glossary
The complete Artificial Intelligence glossary
Permutation Importance
Feature importance evaluation technique that measures the degradation in model performance when the values of a variable are randomly permuted, thereby breaking its relationship with the target.
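A minimal sketch of the idea, assuming a simple least-squares model as a stand-in for any trained predictor (the data and column roles are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: y depends strongly on column 0, weakly on column 1,
# and not at all on column 2.
X = rng.normal(size=(500, 3))
y = 3.0 * X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.1, size=500)

# Stand-in "model": ordinary least squares fit.
w, *_ = np.linalg.lstsq(X, y, rcond=None)
predict = lambda M: M @ w

def mse(y_true, y_pred):
    return float(np.mean((y_true - y_pred) ** 2))

baseline = mse(y, predict(X))

def permutation_importance(X, y, n_repeats=10):
    """Mean increase in MSE when each column is shuffled."""
    importances = np.zeros(X.shape[1])
    for j in range(X.shape[1]):
        scores = []
        for _ in range(n_repeats):
            Xp = X.copy()
            Xp[:, j] = rng.permutation(Xp[:, j])  # break column j's link to y
            scores.append(mse(y, predict(Xp)) - baseline)
        importances[j] = np.mean(scores)
    return importances

imp = permutation_importance(X, y)
```

Columns that the model relies on heavily show a large performance drop when permuted; irrelevant columns show a drop near zero.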
Partial Dependence Plot (PDP)
Visualization showing the average marginal effect of one or two features on the model's prediction, obtained by averaging out (marginalizing over) the remaining features.
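A minimal sketch of the computation behind a PDP, assuming an illustrative black-box function in place of a trained model:

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.uniform(-1, 1, size=(300, 2))

# Illustrative black-box model: nonlinear in x0, linear in x1.
def model(M):
    return M[:, 0] ** 2 + 0.5 * M[:, 1]

def partial_dependence(model, X, feature, grid):
    """Average prediction with `feature` clamped to each grid value."""
    pd = []
    for v in grid:
        Xv = X.copy()
        Xv[:, feature] = v           # clamp the feature for every instance
        pd.append(model(Xv).mean())  # average over the data distribution
    return np.array(pd)

grid = np.linspace(-1, 1, 5)
pdp0 = partial_dependence(model, X, feature=0, grid=grid)
```

For this toy model the curve for feature 0 recovers the quadratic shape, shifted by the average contribution of the other feature.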
Accumulated Local Effects (ALE)
Alternative to the PDP that computes feature effects from local prediction differences within small intervals of the feature, accounting for correlations between variables and thus avoiding the bias partial dependence plots suffer on correlated data.
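A simplified first-order ALE sketch, assuming quantile-based bins and an illustrative model; production implementations handle edge cases more carefully:

```python
import numpy as np

rng = np.random.default_rng(2)
X = rng.uniform(0, 1, size=(400, 2))

# Illustrative model: linear in x0, nonlinear in x1.
def model(M):
    return 2.0 * M[:, 0] + np.sin(3 * M[:, 1])

def ale_1d(model, X, feature, n_bins=10):
    """First-order ALE: accumulate mean local prediction differences per bin."""
    x = X[:, feature]
    # Quantile-based bin edges over the feature's observed range.
    edges = np.quantile(x, np.linspace(0, 1, n_bins + 1))
    effects = np.zeros(n_bins)
    for k in range(n_bins):
        lo, hi = edges[k], edges[k + 1]
        in_bin = (x >= lo) & ((x <= hi) if k == n_bins - 1 else (x < hi))
        if not in_bin.any():
            continue
        Xlo, Xhi = X[in_bin].copy(), X[in_bin].copy()
        Xlo[:, feature], Xhi[:, feature] = lo, hi
        # Local effect: prediction change across the bin, other features fixed.
        effects[k] = (model(Xhi) - model(Xlo)).mean()
    ale = np.concatenate([[0.0], np.cumsum(effects)])
    return edges, ale - ale.mean()   # center so the effect averages to zero

edges, ale = ale_1d(model, X, feature=0)
```

Because only instances actually observed inside each bin are perturbed, and only across a short interval, ALE never evaluates the model far outside the data distribution, which is the source of PDP bias under correlation.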
Global Surrogate Model
Simple and interpretable model (such as a decision tree or linear regression) trained to approximate the global behavior of a complex model across the entire dataset.
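A minimal sketch using a linear regression surrogate; the key detail is that the surrogate is trained on the black box's predictions, not on the true labels (the black-box function here is a hypothetical stand-in):

```python
import numpy as np

rng = np.random.default_rng(3)
X = rng.normal(size=(500, 3))

# Stand-in "complex" model: mildly nonlinear black box.
def black_box(M):
    return np.tanh(M[:, 0]) + 0.5 * M[:, 1]

y_bb = black_box(X)

# Global surrogate: interpretable linear model fitted to the
# black box's predictions over the entire dataset.
A = np.column_stack([X, np.ones(len(X))])   # add intercept column
coef, *_ = np.linalg.lstsq(A, y_bb, rcond=None)
y_sur = A @ coef

# Fidelity: how well the surrogate reproduces the black box (R^2).
r2 = 1 - np.sum((y_bb - y_sur) ** 2) / np.sum((y_bb - y_bb.mean()) ** 2)
```

The fidelity score (R² against the black box's outputs) tells you how far the surrogate's simple explanation can be trusted globally.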
Local Surrogate Model
Interpretable model trained specifically to approximate the predictions of a complex model in a restricted neighborhood around a particular instance.
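A LIME-style sketch of the idea: sample perturbations around one instance, weight them by proximity, and fit a weighted linear model (kernel width and sampling scale are illustrative choices):

```python
import numpy as np

rng = np.random.default_rng(4)

# Hypothetical black box to be explained locally.
def black_box(M):
    return np.sin(M[:, 0]) + M[:, 1] ** 2

x0 = np.array([0.5, 1.0])          # instance to explain

# Sample perturbations in a small neighborhood of x0.
Z = x0 + rng.normal(scale=0.3, size=(1000, 2))
fz = black_box(Z)

# Proximity weights: closer samples matter more (Gaussian kernel).
w = np.exp(-np.sum((Z - x0) ** 2, axis=1) / (2 * 0.3 ** 2))

# Weighted least squares -> local linear surrogate around x0.
A = np.column_stack([Z - x0, np.ones(len(Z))])
sw = np.sqrt(w)[:, None]
coef, *_ = np.linalg.lstsq(sw * A, np.sqrt(w) * fz, rcond=None)
local_slopes = coef[:2]   # approximate local gradient of the black box at x0
```

The fitted slopes are only valid near x0: for this toy function they approximate cos(0.5) and 2·x1, the true local derivatives at the instance.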
Individual Conditional Expectation (ICE)
Visualization that plots the model's prediction for each individual instance while varying a specific feature, revealing heterogeneity in effects beyond the average.
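A minimal sketch computing ICE curves for a model with an interaction that the average (PDP) curve would hide (model and data are illustrative):

```python
import numpy as np

rng = np.random.default_rng(5)
X = rng.uniform(-1, 1, size=(50, 2))

# Model with an interaction: the effect of x0 flips with the sign of x1,
# so the average curve washes out while individual curves do not.
def model(M):
    return M[:, 0] * np.sign(M[:, 1])

def ice_curves(model, X, feature, grid):
    """One prediction curve per instance, varying only `feature`."""
    curves = np.empty((len(X), len(grid)))
    for i, v in enumerate(grid):
        Xv = X.copy()
        Xv[:, feature] = v
        curves[:, i] = model(Xv)
    return curves

grid = np.linspace(-1, 1, 11)
curves = ice_curves(model, X, feature=0, grid=grid)
pdp = curves.mean(axis=0)   # the PDP is just the average of the ICE curves
```

Here every individual curve has slope +1 or -1, while their average is nearly flat: exactly the heterogeneity ICE plots are meant to reveal.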
Gradient SHAP
SHAP variant that combines gradient methods with reference samples to efficiently approximate SHAP values in deep learning models.
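An expected-gradients sketch of the idea, assuming a hand-written differentiable function with an analytic gradient in place of a neural network (real implementations obtain gradients by backpropagation):

```python
import numpy as np

rng = np.random.default_rng(6)

# Differentiable stand-in for a network, with its analytic gradient.
def f(x):                      # x: (..., 2)
    return x[..., 0] ** 2 + 3.0 * x[..., 1]

def grad_f(x):
    g = np.empty_like(x)
    g[..., 0] = 2.0 * x[..., 0]
    g[..., 1] = 3.0
    return g

def gradient_shap(x, baselines, n_samples=2000):
    """Expected-gradients approximation of SHAP values."""
    idx = rng.integers(0, len(baselines), size=n_samples)
    b = baselines[idx]                         # random reference samples
    alpha = rng.uniform(size=(n_samples, 1))   # random points on each path
    z = b + alpha * (x - b)                    # interpolate baseline -> input
    return np.mean(grad_f(z) * (x - b), axis=0)

x = np.array([1.0, 2.0])
baselines = rng.normal(scale=0.5, size=(100, 2))
phi = gradient_shap(x, baselines)
```

The attributions approximately satisfy completeness: they sum to the difference between the prediction at x and the average prediction over the baselines.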
Layer-wise Relevance Propagation (LRP)
Backward propagation technique that redistributes a neural network's output score to the input features layer by layer, conserving the total relevance at each step.
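A minimal LRP epsilon-rule sketch on a tiny hand-built two-layer ReLU network (weights are illustrative; biases are omitted so relevance conservation is nearly exact):

```python
import numpy as np

# Tiny fixed two-layer ReLU network.
W1 = np.array([[ 1.0, -0.5,  0.3],
               [ 0.5,  0.8, -0.2]])   # input(2) -> hidden(3)
W2 = np.array([0.7, -0.4, 1.2])       # hidden(3) -> output(1)

def forward(x):
    a1 = np.maximum(0.0, x @ W1)      # hidden activations (ReLU)
    return a1, float(a1 @ W2)

def lrp_epsilon(x, eps=1e-9):
    """LRP epsilon-rule: redistribute the output back to the inputs."""
    a1, out = forward(x)
    # Output -> hidden: split relevance by each neuron's contribution.
    z2 = a1 * W2
    s2 = z2.sum()
    R1 = z2 / (s2 + eps * np.sign(s2)) * out
    # Hidden -> input: same rule, one step further back.
    z1 = x[:, None] * W1              # contribution of input i to hidden k
    s1 = z1.sum(axis=0)
    R0 = (z1 / (s1 + eps * np.sign(s1)) * R1).sum(axis=1)
    return R0, out
```

The defining property to check is conservation: the input relevances sum (up to the epsilon stabilizer) to the network's output score.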
Feature Contribution
Quantitative measure of the individual impact of each feature on the final prediction, often expressed as a difference from a reference or baseline value.
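For a linear model the decomposition is exact, which makes a good minimal illustration (weights, baseline, and instance below are arbitrary):

```python
import numpy as np

# Linear model: feature contributions decompose the prediction exactly.
w = np.array([2.0, -1.0, 0.5])
b = 0.3

def predict(x):
    return float(x @ w + b)

baseline = np.array([0.1, 0.2, 0.0])   # reference input (e.g. feature means)
x = np.array([1.0, -0.5, 2.0])

# Contribution of feature i: how much moving x_i away from the
# baseline value shifts the prediction.
contrib = w * (x - baseline)
```

The contributions plus the baseline prediction recover the prediction at x; nonlinear models need methods like SHAP to obtain an analogous additive decomposition.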
Grad-CAM (Gradient-weighted Class Activation Mapping)
Visualization technique for convolutional neural networks that generates heat maps locating the regions important for a specific prediction, using the gradients of the class score with respect to the final convolutional layer's feature maps.
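A heavily simplified numpy sketch of the Grad-CAM arithmetic, assuming hypothetical feature maps and a global-average-pooling linear head so the gradients are known in closed form (a real implementation backpropagates through a CNN):

```python
import numpy as np

rng = np.random.default_rng(7)

# Stand-in for the last conv layer's output: K feature maps of size H x W.
K, H, W = 4, 6, 6
A = rng.uniform(size=(K, H, W))

# Minimal head: global average pooling + linear weights, so the
# class-score gradient w.r.t. every cell of map k is w[k] / (H * W).
w = np.array([1.5, -0.8, 0.2, 0.0])
score = float(w @ A.mean(axis=(1, 2)))

# Grad-CAM: channel weights = spatially averaged gradients,
# heat map = ReLU of the weighted sum of feature maps.
grads = w[:, None, None] / (H * W) * np.ones_like(A)
alpha = grads.mean(axis=(1, 2))                            # (K,)
heatmap = np.maximum(0.0, np.tensordot(alpha, A, axes=1))  # (H, W)
```

The ReLU keeps only regions that push the class score up; in practice the heat map is then upsampled to the input image's resolution.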
SHAP Interaction Values
Extension of SHAP values that decomposes not only the importance of individual features but also their interaction effects, quantifying how pairs of variables collectively influence the prediction.
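An exact enumeration sketch for a tiny three-feature value function with a built-in pairwise interaction (the value function is illustrative; real implementations approximate this over model predictions):

```python
import numpy as np
from itertools import combinations
from math import factorial

# Toy value function: additive single effects, plus a bonus of 2.0
# when features 0 and 1 are both present (a genuine interaction).
def v(S):
    base = {0: 1.0, 1: 0.5, 2: -0.3}
    val = sum(base[i] for i in S)
    if 0 in S and 1 in S:
        val += 2.0
    return val

M = 3
features = range(M)

def shap_interaction(i, j):
    """Exact off-diagonal SHAP interaction value for features i and j."""
    others = [k for k in features if k not in (i, j)]
    total = 0.0
    for r in range(len(others) + 1):
        for S in combinations(others, r):
            S = set(S)
            weight = (factorial(len(S)) * factorial(M - len(S) - 2)
                      / (2 * factorial(M - 1)))
            # Discrete second difference: joint effect minus single effects.
            delta = v(S | {i, j}) - v(S | {i}) - v(S | {j}) + v(S)
            total += weight * delta
    return total

phi_01 = shap_interaction(0, 1)
phi_02 = shap_interaction(0, 2)
```

The interaction of 2.0 between features 0 and 1 is split symmetrically (1.0 to each ordered pair), while the non-interacting pair gets exactly zero.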