Local interpretation methods
Shapley Values
The fundamental theoretical concept underlying SHAP: the Shapley value of a feature is its average marginal contribution to the model's output across all possible feature coalitions, which guarantees a fair, axiomatically grounded distribution of importance among the features.
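The averaging over coalitions can be computed exactly for a small number of features. The sketch below (a minimal illustration, not the SHAP library's implementation) enumerates every coalition and applies the standard Shapley weighting; the payoff function `value_fn` and the feature contributions are hypothetical toy values chosen so the result is easy to verify.

```python
from itertools import combinations
from math import factorial

def shapley_values(value_fn, features):
    """Exact Shapley values by enumerating all coalitions.

    value_fn(subset) -> payoff (e.g. model output) for that feature subset.
    """
    n = len(features)
    phi = {}
    for i in features:
        others = [f for f in features if f != i]
        total = 0.0
        for k in range(n):
            for S in combinations(others, k):
                # Shapley weight: |S|! * (n - |S| - 1)! / n!
                w = factorial(k) * factorial(n - k - 1) / factorial(n)
                # marginal contribution of feature i to coalition S
                total += w * (value_fn(set(S) | {i}) - value_fn(set(S)))
        phi[i] = total
    return phi

# Toy payoff: additive contributions plus one pairwise interaction (hypothetical)
contrib = {"a": 1.0, "b": 2.0, "c": 3.0}

def value_fn(subset):
    v = sum(contrib[f] for f in subset)
    if {"a", "b"} <= subset:
        v += 1.0  # the interaction bonus is split fairly between a and b
    return v

phi = shapley_values(value_fn, ["a", "b", "c"])
# Efficiency axiom: the values sum to value_fn(all) - value_fn(empty)
```

Here the interaction between `a` and `b` is shared equally, so `phi` comes out to 1.5, 2.5, and 3.0: the "fair distribution" the definition refers to.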