AI Glossary
The complete dictionary of Artificial Intelligence
Attention Map
Visualization of the attention weights assigned by a neural model to different parts of the input during the decision-making process.
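As a minimal sketch (shapes and values are illustrative, not from the source), an attention map reduces to the row-stochastic matrix produced by scaled dot-product attention: each row shows how one output position distributes its attention over the input positions.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

# Toy query/key matrices: 3 output positions attending over 4 input positions.
rng = np.random.default_rng(0)
Q = rng.normal(size=(3, 8))   # 3 queries, dimension 8
K = rng.normal(size=(4, 8))   # 4 keys

# Attention map: rows are output positions, columns are input positions.
attn_map = softmax(Q @ K.T / np.sqrt(8))

# Each row is a probability distribution over the input.
print(np.round(attn_map, 2))
```

In practice this matrix is rendered as a heat map, with darker cells marking the input positions the model attended to most.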
Saliency Map
Heat map that indicates the most influential pixels or regions of an image for a model's prediction, calculated via gradients of the output with respect to the input.
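A hedged sketch of the gradient computation behind a saliency map, using a hypothetical one-hidden-layer network small enough to differentiate by hand (the weights and the 4-"pixel" input are illustrative):

```python
import numpy as np

rng = np.random.default_rng(1)
# Hypothetical one-hidden-layer network on a flattened 4-"pixel" input.
W = rng.normal(size=(5, 4))   # hidden-layer weights
v = rng.normal(size=5)        # output weights
x = rng.normal(size=4)        # input "image"

h = W @ x
y = v @ np.maximum(h, 0.0)    # forward pass: v . relu(W x)

# Gradient of the output w.r.t. the input: dy/dx = W^T (v * 1[h > 0]).
grad = W.T @ (v * (h > 0))

saliency = np.abs(grad)       # per-pixel influence on the prediction
print(saliency.argmax())      # index of the most influential "pixel"
```

Real pipelines obtain `grad` from an autodiff framework's backward pass rather than by hand; the absolute gradient is then reshaped back to image dimensions and displayed as a heat map.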
Self-Attention Visualization
Visual representation of intra-sequence attention relationships in Transformer models, showing how each token interacts with other tokens in the same sequence.
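A minimal sketch of what gets visualized, assuming illustrative tokens and random projection matrices: queries and keys are derived from the *same* sequence, so the resulting matrix shows how each token attends to every other token.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

tokens = ["the", "cat", "sat", "down"]          # illustrative sentence
rng = np.random.default_rng(2)
X = rng.normal(size=(4, 8))                     # token embeddings
Wq = rng.normal(size=(8, 8))                    # query projection
Wk = rng.normal(size=(8, 8))                    # key projection

# Q and K come from the SAME sequence: that is what makes it self-attention.
A = softmax((X @ Wq) @ (X @ Wk).T / np.sqrt(8))

# Text rendering of the attention matrix, one row per query token.
for tok, row in zip(tokens, A):
    print(f"{tok:>5}: " + " ".join(f"{w:.2f}" for w in row))
```

Visualization tools draw the same matrix as a grid or as lines connecting tokens, one panel per layer and head.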
Class Activation Mapping
Technique that generates class-discriminative localization maps from the activations of the final convolutional layers; the original formulation relies on a global-average-pooling classifier head, while its gradient-based generalization (Grad-CAM) works without architectural modifications.
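A minimal sketch of the core CAM computation, assuming hypothetical feature maps and classifier weights: the localization map is a per-channel weighted sum of the final convolutional activations, followed by a ReLU.

```python
import numpy as np

rng = np.random.default_rng(3)
# Hypothetical final conv-layer output: 6 channels of 7x7 feature maps.
feature_maps = np.maximum(rng.normal(size=(6, 7, 7)), 0.0)
# Classifier weights for one target class (one weight per channel,
# as in the original CAM setup with global average pooling).
class_weights = rng.normal(size=6)

# CAM = weighted sum of the feature maps over channels, then ReLU.
cam = np.maximum(np.tensordot(class_weights, feature_maps, axes=1), 0.0)

# Normalize to [0, 1] for display as a heat map.
cam = cam / cam.max()
print(cam.shape)   # (7, 7) map, upsampled to input-image size in practice
```

In Grad-CAM the per-channel weights are replaced by the spatially averaged gradients of the class score with respect to each feature map, which removes the need for a GAP head.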
Token-wise Attention
Granular analysis of attention weights at the individual token level in natural language processing models.
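One common token-level summary, sketched with an assumed attention matrix (in practice it would come from a model, e.g. one head of a Transformer layer): averaging over queries gives the attention each token *receives*.

```python
import numpy as np

tokens = ["[CLS]", "deep", "learning", "rocks"]
# Assumed attention matrix: rows are query tokens, columns are key tokens.
A = np.array([
    [0.70, 0.10, 0.10, 0.10],
    [0.25, 0.25, 0.40, 0.10],
    [0.30, 0.30, 0.20, 0.20],
    [0.40, 0.10, 0.30, 0.20],
])

received = A.mean(axis=0)   # average attention each token receives
for tok, score in zip(tokens, received):
    print(f"{tok:>9}: {score:.3f}")
print("most attended:", tokens[received.argmax()])
```

With these illustrative numbers the `[CLS]` token receives the most attention, a pattern frequently reported for BERT-style models.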
Feature Attribution
Quantitative process that assigns importance scores to input features to explain their individual contribution to the model's prediction.
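A simple occlusion-style instance of feature attribution, sketched with a hypothetical linear predictor: each feature's score is the change in the prediction when that feature is masked out.

```python
import numpy as np

# Hypothetical trained linear model standing in for the predictor.
weights = np.array([2.0, -1.0, 0.0, 0.5])
def predict(x):
    return float(weights @ x)

x = np.array([1.0, 3.0, 5.0, 2.0])
baseline = predict(x)

# Occlusion: zero out one feature at a time and score it by the
# resulting change in the prediction.
scores = []
for i in range(len(x)):
    x_masked = x.copy()
    x_masked[i] = 0.0
    scores.append(baseline - predict(x_masked))

print(scores)   # → [2.0, -3.0, 0.0, 1.0]
```

For a linear model the occlusion score recovers `weight * value` exactly; methods such as SHAP or Integrated Gradients generalize the same idea to nonlinear models.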
Attention Head Importance
Quantitative evaluation of the contribution of each attention head in a multi-head model to the overall system performance.
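A minimal ablation sketch of head importance, using assumed per-head outputs and an assumed output projection rather than a real model: each head is scored by how much the layer output changes when that head is silenced.

```python
import numpy as np

rng = np.random.default_rng(4)
n_heads, seq, d_head = 4, 5, 8
# Assumed per-head outputs for one layer: (head, position, dim).
head_outputs = rng.normal(size=(n_heads, seq, d_head))
Wo = rng.normal(size=(n_heads * d_head, 3))   # output projection to 3 "logits"

def layer_output(mask):
    # mask[h] = 0 silences head h, as in head-ablation studies.
    mixed = (head_outputs * mask[:, None, None]).transpose(1, 0, 2)
    return mixed.reshape(seq, -1) @ Wo

full = layer_output(np.ones(n_heads))

# Importance of head h = mean output change when that head is zeroed.
importance = []
for h in range(n_heads):
    mask = np.ones(n_heads)
    mask[h] = 0.0
    importance.append(np.abs(full - layer_output(mask)).mean())

print([round(s, 3) for s in importance])
```

Ablation studies of this kind have shown that a large fraction of heads can often be pruned with little performance loss, which is exactly what head-importance scores are used to decide.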