AI Glossary
A complete glossary of artificial intelligence
Partial Dependence Plot
Visualization that shows the marginal relationship between one or two features and the model's prediction, obtained by averaging predictions over the values of all other features. This technique helps show how changes in specific variables affect the model's predictions on average.
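For illustration, a minimal sketch using scikit-learn's PartialDependenceDisplay; the synthetic data and gradient-boosted model are placeholders:

```python
import matplotlib.pyplot as plt
from sklearn.datasets import make_regression
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.inspection import PartialDependenceDisplay

X, y = make_regression(n_samples=500, n_features=5, random_state=0)
model = GradientBoostingRegressor(random_state=0).fit(X, y)

# For each grid value of features 0 and 2, the display averages the model's
# predictions over all rows of X, yielding the marginal (partial) dependence.
PartialDependenceDisplay.from_estimator(model, X, features=[0, 2])
plt.show()
```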
Feature Heatmap
Two-dimensional graphical representation using colors to visualize the importance or influence of different features on the model's predictions. Color intensities indicate the degree of impact, allowing quick identification of the most influential variables.
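One simple way to build such a heatmap, sketched here for a linear model where a feature's per-instance contribution is just coefficient × value; the data and model are illustrative:

```python
import matplotlib.pyplot as plt
from sklearn.datasets import make_regression
from sklearn.linear_model import LinearRegression

X, y = make_regression(n_samples=30, n_features=6, random_state=0)
model = LinearRegression().fit(X, y)

# Rows = instances, columns = features; each cell is that feature's
# contribution (coefficient * value) to the instance's prediction.
contributions = X * model.coef_
plt.imshow(contributions, aspect="auto", cmap="coolwarm")
plt.xlabel("feature"); plt.ylabel("instance")
plt.colorbar(label="contribution to prediction")
plt.show()
```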
SHAP Diagram
Visualization based on SHAP (SHapley Additive exPlanations) values that shows how each feature contributes to shifting the prediction from the base value to the final prediction. These plots allow individual interpretation of each prediction by quantifying the positive or negative impact of each variable.
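A minimal sketch with the shap package (assuming it is installed); TreeExplainer and summary_plot are its standard entry points for tree models, and the data and model here are placeholders:

```python
import shap
from sklearn.datasets import make_regression
from sklearn.ensemble import RandomForestRegressor

X, y = make_regression(n_samples=200, n_features=5, random_state=0)
model = RandomForestRegressor(random_state=0).fit(X, y)

# SHAP values quantify each feature's push away from the base value.
explainer = shap.TreeExplainer(model)
shap_values = explainer.shap_values(X)
shap.summary_plot(shap_values, X)  # beeswarm summary across all instances
```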
ICE Plot (Individual Conditional Expectation)
Visualization that plots the model's prediction for each individual instance while varying a specific feature, revealing heterogeneity in how instances respond to that feature. Unlike PDPs, which show average effects, ICE plots show variation at the individual level.
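scikit-learn draws ICE curves through the same display class used for PDPs; in this sketch (placeholder data and model), kind="both" overlays the per-instance curves with their average:

```python
import matplotlib.pyplot as plt
from sklearn.datasets import make_regression
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.inspection import PartialDependenceDisplay

X, y = make_regression(n_samples=300, n_features=4, random_state=0)
model = GradientBoostingRegressor(random_state=0).fit(X, y)

# One curve per instance for feature 0, plus the averaged PDP on top.
PartialDependenceDisplay.from_estimator(model, X, features=[0], kind="both")
plt.show()
```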
LIME Decision Map
Visualization generated by LIME (Local Interpretable Model-agnostic Explanations), which explains individual predictions by fitting an interpretable surrogate model locally around the instance and visualizing the most important features. This technique helps understand why a model makes a specific decision for a particular case.
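A sketch using the lime package (assumed installed) on a toy classifier; the dataset and parameter values are illustrative:

```python
from lime.lime_tabular import LimeTabularExplainer
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier

data = load_iris()
clf = RandomForestClassifier(random_state=0).fit(data.data, data.target)

explainer = LimeTabularExplainer(
    data.data, feature_names=data.feature_names,
    class_names=list(data.target_names), mode="classification")

# Fit a local surrogate around one instance and list its top features.
exp = explainer.explain_instance(data.data[0], clf.predict_proba, num_features=4)
print(exp.as_list())
```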
Permutation Importance Visualization
Bar chart representing feature importance calculated by measuring the decrease in model performance when a feature's values are randomly permuted. This method provides a model-agnostic evaluation of variable importance based on their actual impact on predictions.
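A minimal sketch with scikit-learn's permutation_importance; as is usual for this method, the score drop is measured on held-out data (data and model are placeholders):

```python
import matplotlib.pyplot as plt
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.inspection import permutation_importance
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=400, n_features=6, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
model = RandomForestClassifier(random_state=0).fit(X_tr, y_tr)

# Mean drop in test accuracy when each feature is shuffled, over 10 repeats.
result = permutation_importance(model, X_te, y_te, n_repeats=10, random_state=0)
plt.barh(range(X.shape[1]), result.importances_mean)
plt.xlabel("mean accuracy drop when permuted")
plt.show()
```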
ROC Curve and AUC
Graph plotting the true positive rate against the false positive rate across classification thresholds, with the AUC (Area Under the Curve) quantifying overall classifier performance. This visualization makes it possible to evaluate and compare classifiers' discriminative ability regardless of the chosen threshold.
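A minimal sketch with scikit-learn (placeholder data and model):

```python
import matplotlib.pyplot as plt
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import RocCurveDisplay, roc_auc_score
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=500, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
clf = LogisticRegression().fit(X_tr, y_tr)

# Sweeps every threshold over the test scores and plots TPR vs. FPR.
RocCurveDisplay.from_estimator(clf, X_te, y_te)
print("AUC:", roc_auc_score(y_te, clf.predict_proba(X_te)[:, 1]))
plt.show()
```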
Waterfall Diagram
Sequential visualization showing how contributions from different features accumulate to reach the model's final prediction, similar to a financial waterfall chart. Each bar represents the positive or negative impact of a feature on the final prediction.
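The chart itself is easy to assemble by hand from per-feature contributions (for example SHAP values); the base value, feature names, and contributions below are made-up numbers:

```python
import numpy as np
import matplotlib.pyplot as plt

base_value = 10.0  # hypothetical expected model output
contributions = {"age": 2.5, "income": -1.2, "tenure": 0.8, "region": -0.3}

labels = ["base"] + list(contributions) + ["prediction"]
values = [base_value] + list(contributions.values())
# Each bar starts where the previous one ended; the last bar is the total.
starts = np.concatenate([[0.0], np.cumsum(values)[:-1], [0.0]])
heights = values + [sum(values)]
colors = ["grey"] + ["green" if v >= 0 else "red"
                     for v in contributions.values()] + ["grey"]

plt.bar(labels, heights, bottom=starts, color=colors)
plt.ylabel("model output")
plt.show()
```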
Local Force Plot
Visual representation showing how each feature pushes a model's prediction towards a higher or lower value compared to the expected baseline value. Forces are represented by colored arrows or bars indicating the direction and magnitude of influence.
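The shap package renders force plots directly; a rough sketch reusing a tree explainer (matplotlib=True targets scripts rather than notebooks; data and model are placeholders):

```python
import shap
from sklearn.datasets import make_regression
from sklearn.ensemble import RandomForestRegressor

X, y = make_regression(n_samples=200, n_features=5, random_state=0)
model = RandomForestRegressor(random_state=0).fit(X, y)

explainer = shap.TreeExplainer(model)
shap_values = explainer.shap_values(X)

# Arrows push the prediction for instance 0 away from the expected base value.
shap.force_plot(explainer.expected_value, shap_values[0], X[0], matplotlib=True)
```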
Adjustable Decision Visualization
Interactive interface allowing dynamic adjustment of input feature values and real-time observation of how these changes affect model predictions. These tools facilitate exploration of complex relationships and model sensitivity analysis.
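A minimal sketch of such an interface for a Jupyter notebook, using ipywidgets (assumed installed); the model, data, and slider range are illustrative:

```python
from ipywidgets import interact
from sklearn.datasets import make_regression
from sklearn.ensemble import RandomForestRegressor

X, y = make_regression(n_samples=200, n_features=3, random_state=0)
model = RandomForestRegressor(random_state=0).fit(X, y)
baseline = X.mean(axis=0)

# Dragging the slider re-runs the model with feature 0 set to the new value.
@interact(value=(-3.0, 3.0, 0.1))
def what_if(value=0.0):
    x = baseline.copy()
    x[0] = value  # adjust feature 0, hold the others at their means
    print("prediction:", model.predict(x.reshape(1, -1))[0])
```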
Anchor Decision Map
Visualization based on Anchor explanations that identifies sufficient conditions for a model to maintain its prediction, presented as if-then rules. These maps help understand local decision regions by showing critical features that anchor a prediction.
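One available implementation is AnchorTabular from the alibi package; the sketch below follows its documented usage as I understand it, so treat the exact API as an assumption:

```python
from alibi.explainers import AnchorTabular
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier

data = load_iris()
clf = RandomForestClassifier(random_state=0).fit(data.data, data.target)

explainer = AnchorTabular(clf.predict, feature_names=data.feature_names)
explainer.fit(data.data)

# The anchor is the set of if-then conditions that locks in the prediction.
explanation = explainer.explain(data.data[0], threshold=0.95)
print(explanation.anchor)
```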
Dependency Diagram
Two-dimensional or three-dimensional graph illustrating relationships and dependencies between input features and model predictions, often combined with point densities. These diagrams reveal nonlinear interactions and complex correlations in the data.
Decision Contour Plot
Two or three-dimensional visualization showing a classification model's decision boundaries through colored contour lines or surfaces. These plots help understand how the model partitions feature space to perform classifications.
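The classic recipe: evaluate the classifier on a dense grid and shade each predicted region (toy data and model below):

```python
import numpy as np
import matplotlib.pyplot as plt
from sklearn.datasets import make_moons
from sklearn.svm import SVC

X, y = make_moons(noise=0.2, random_state=0)
clf = SVC(gamma=2).fit(X, y)

# Predict over a fine grid covering the data, then contour the class labels.
xx, yy = np.meshgrid(np.linspace(-2, 3, 300), np.linspace(-2, 2, 300))
Z = clf.predict(np.c_[xx.ravel(), yy.ravel()]).reshape(xx.shape)
plt.contourf(xx, yy, Z, alpha=0.3)
plt.scatter(X[:, 0], X[:, 1], c=y, edgecolor="k")
plt.show()
```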
Counterfactual Visualization
Graphical representation showing the minimal changes required to input features to change the model's prediction to a different desired outcome. These visualizations help understand boundary conditions and explore hypothetical scenarios.
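Dedicated libraries exist for this, but the core idea can be sketched with a naive single-feature search (everything below is illustrative, not a production method):

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression

X, y = make_classification(n_samples=200, n_features=4, random_state=0)
clf = LogisticRegression().fit(X, y)

x = X[0].copy()
original = clf.predict(x.reshape(1, -1))[0]

# Scan perturbations of each feature; keep the smallest one that flips the class.
best = None  # (feature index, signed change)
for j in range(X.shape[1]):
    for delta in np.linspace(-3, 3, 121):
        candidate = x.copy()
        candidate[j] += delta
        if clf.predict(candidate.reshape(1, -1))[0] != original:
            if best is None or abs(delta) < abs(best[1]):
                best = (j, delta)
print("minimal single-feature change found:", best)
```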
Regression Tree
Variant of decision tree specialized for regression problems, visualizing how the model partitions feature space into regions with constant target values. Each leaf represents the mean of target values in that region, facilitating interpretation of nonlinear relationships.
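A minimal sketch with scikit-learn (toy data):

```python
import matplotlib.pyplot as plt
from sklearn.datasets import make_regression
from sklearn.tree import DecisionTreeRegressor, plot_tree

X, y = make_regression(n_samples=200, n_features=3, random_state=0)
tree = DecisionTreeRegressor(max_depth=3, random_state=0).fit(X, y)

# Each leaf reports the mean target value of the training samples it holds.
plot_tree(tree, filled=True)
plt.show()
```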
Decision Flow Diagram
Schematic representation of a model's sequential decision process, showing decision points, possible actions, and expected outcomes in a logical flow. These diagrams transform complex algorithms into understandable step-by-step processes.
Sensitivity Map
Grid or surface visualization showing how model predictions systematically vary when two features are modified simultaneously. These maps reveal interactions between variables and identify regions of high sensitivity in the feature space.
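A rough sketch of building such a map by hand (data and model are placeholders):

```python
import numpy as np
import matplotlib.pyplot as plt
from sklearn.datasets import make_regression
from sklearn.ensemble import RandomForestRegressor

X, y = make_regression(n_samples=300, n_features=4, random_state=0)
model = RandomForestRegressor(random_state=0).fit(X, y)

# Vary features 0 and 1 over a grid while holding the others at their means.
gi, gj = np.meshgrid(np.linspace(X[:, 0].min(), X[:, 0].max(), 50),
                     np.linspace(X[:, 1].min(), X[:, 1].max(), 50))
grid = np.tile(X.mean(axis=0), (gi.size, 1))
grid[:, 0] = gi.ravel()
grid[:, 1] = gj.ravel()
Z = model.predict(grid).reshape(gi.shape)

plt.pcolormesh(gi, gj, Z, shading="auto")
plt.colorbar(label="prediction")
plt.xlabel("feature 0"); plt.ylabel("feature 1")
plt.show()
```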
Global Importance Visualization
Summary chart presenting the ranking of features by order of importance for the model as a whole, often in the form of horizontal bars with normalized scores. This overview allows for quick identification of the most influential factors in the model's global predictions.
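A minimal sketch using a tree ensemble's built-in importances (placeholder data; the permutation importance shown earlier is a model-agnostic alternative):

```python
import numpy as np
import matplotlib.pyplot as plt
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier

X, y = make_classification(n_samples=400, n_features=8, random_state=0)
model = RandomForestClassifier(random_state=0).fit(X, y)

# Horizontal bars of the normalized importances, smallest at the bottom.
order = np.argsort(model.feature_importances_)
plt.barh([f"feature {k}" for k in order], model.feature_importances_[order])
plt.xlabel("global importance (sums to 1)")
plt.show()
```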