AI Glossary
The complete dictionary of Artificial Intelligence
Matrix Factorization
Algebraic technique that decomposes a user-item matrix into the product of two lower-rank matrices to reveal latent features of preferences.
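The decomposition can be sketched in a few lines of NumPy. This is a minimal illustration with a hypothetical 3×3 rating matrix (not a production implementation): observed entries of R are approximated by the product of a user-factor matrix P and an item-factor matrix Q, fitted by gradient descent on the masked squared error.

```python
import numpy as np

# Minimal matrix-factorization sketch on hypothetical toy data:
# approximate observed entries of R with P (users x k) @ Q.T (k x items).
rng = np.random.default_rng(0)
R = np.array([[5.0, 3.0, 0.0],
              [4.0, 0.0, 1.0],
              [1.0, 1.0, 5.0]])
mask = R > 0                       # 0 marks a missing rating
k, lr = 2, 0.05
P = 0.1 * rng.standard_normal((3, k))
Q = 0.1 * rng.standard_normal((3, k))

for _ in range(2000):
    E = mask * (R - P @ Q.T)       # error on observed entries only
    P += lr * (E @ Q)              # gradient step for user factors
    Q += lr * (E.T @ P)            # gradient step for item factors

pred = P @ Q.T                     # also fills in the missing entries
print(np.round(pred, 1))
```

The latent dimension k controls the trade-off between expressiveness and overfitting; here k=2 is enough to reconstruct the observed ratings closely.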
Singular Value Decomposition (SVD)
Factorization method that decomposes a matrix M into UΣVᵀ, where U and V are orthogonal and Σ is a diagonal matrix of singular values; truncating to the largest singular values yields the best low-rank approximation in the least-squares sense.
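NumPy exposes this directly as `np.linalg.svd`. A short example with a hypothetical matrix, showing both exact reconstruction and rank-1 truncation:

```python
import numpy as np

# SVD: M = U @ diag(s) @ Vt; truncation gives the best low-rank approximation.
M = np.array([[3.0, 1.0, 1.0],
              [-1.0, 3.0, 1.0]])
U, s, Vt = np.linalg.svd(M, full_matrices=False)

# Full reconstruction is exact (up to floating point).
assert np.allclose(U @ np.diag(s) @ Vt, M)

# Rank-1 truncation: keep only the largest singular value.
M1 = s[0] * np.outer(U[:, 0], Vt[0])
print(np.linalg.norm(M - M1))   # equals the discarded singular value s[1]
```

The Frobenius error of the truncated reconstruction equals the norm of the discarded singular values, which is why SVD-based dimensional reduction is optimal.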
Latent Factors
Unobservable hidden variables representing the intrinsic characteristics of users and items, learned automatically during factorization.
Stochastic Gradient Descent (SGD)
Iterative optimization algorithm that updates factorization parameters using a random sample at each iteration to minimize prediction error.
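In the factorization setting, one SGD step touches a single (user, item, rating) triple and updates only the corresponding rows of the factor matrices. A sketch with hypothetical toy ratings:

```python
import random
import numpy as np

# SGD for matrix factorization on hypothetical (user, item, rating) triples:
# each step uses one sample, not the full matrix.
random.seed(0)
rng = np.random.default_rng(1)
ratings = [(0, 0, 5.0), (0, 1, 3.0), (1, 0, 4.0), (1, 2, 1.0), (2, 2, 5.0)]
k, lr = 2, 0.05
P = 0.1 * rng.standard_normal((3, k))   # user factors
Q = 0.1 * rng.standard_normal((3, k))   # item factors

for _ in range(300):
    random.shuffle(ratings)             # random sample order each epoch
    for u, i, r in ratings:
        err = r - P[u] @ Q[i]           # prediction error for one sample
        pu = P[u].copy()                # keep the old value for Q's update
        P[u] += lr * err * Q[i]         # update only the touched rows
        Q[i] += lr * err * pu

print(round(float(P[0] @ Q[0]), 2))     # approaches the observed rating 5.0
```

Because each step costs O(k) regardless of the matrix size, SGD scales to datasets where a full-gradient pass would be prohibitive.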
Alternating Least Squares (ALS)
Optimization method that alternates between fixing one factor matrix and solving for the other in closed form (a regularized least-squares problem), monotonically decreasing the loss until it converges to a local optimum.
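Each ALS half-step is an exact regularized least-squares solve, so the objective can never increase. A dense-matrix sketch with hypothetical data (real recommenders solve per-row systems over observed entries only):

```python
import numpy as np

# ALS sketch: fix Q and solve for P in closed form, then swap roles.
rng = np.random.default_rng(2)
R = np.array([[5.0, 3.0, 1.0],
              [4.0, 2.0, 1.0],
              [1.0, 1.0, 5.0]])
k, lam = 2, 0.1
P = rng.standard_normal((3, k))
Q = rng.standard_normal((3, k))
I = np.eye(k)

for _ in range(20):
    # Regularized normal equations: P = R Q (Q^T Q + lam I)^-1, and symmetrically.
    P = R @ Q @ np.linalg.inv(Q.T @ Q + lam * I)
    Q = R.T @ P @ np.linalg.inv(P.T @ P + lam * I)

print(np.round(P @ Q.T, 1))   # close to R at rank 2
```

Unlike SGD, ALS has no learning rate to tune, and each subproblem parallelizes naturally across users or items.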
Regularization
Technique that prevents overfitting by adding a penalty on the magnitude of parameters, favoring more general and robust solutions.
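In factorization models the usual choice is an L2 penalty λ(‖p_u‖² + ‖q_i‖²), which appears as a shrinkage term in each update. A single regularized SGD step with hypothetical values:

```python
import numpy as np

# One L2-regularized SGD step (hypothetical values): the -lam * p_u term
# shrinks the factors toward zero, discouraging overfitting.
lr, lam = 0.05, 0.1
p_u = np.array([0.8, -0.3])    # user factor vector
q_i = np.array([0.5, 0.7])     # item factor vector
r_ui = 4.0                     # observed rating

err = r_ui - p_u @ q_i
p_new = p_u + lr * (err * q_i - lam * p_u)   # regularized update
q_new = q_i + lr * (err * p_u - lam * q_i)
print(p_new, q_new)
```

Setting lam to zero recovers the unregularized update; larger values trade training accuracy for generalization.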
Vectorization
Process of representing entities (users/items) as dense vectors in a lower-dimensional latent space.
Non-Negative Matrix Factorization (NMF)
Factorization variant that constrains all resulting matrices to contain only non-negative values, improving the interpretability of factors.
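The classic way to enforce non-negativity is the Lee–Seung multiplicative update rule: each factor is multiplied by a non-negative ratio, so it can never turn negative. A sketch on a hypothetical non-negative matrix:

```python
import numpy as np

# NMF via multiplicative updates: V ~ W @ H with W, H elementwise >= 0.
rng = np.random.default_rng(3)
V = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0],
              [1.0, 0.0, 1.0]])
k = 2
W = rng.random((3, k)) + 0.1     # positive initialization
H = rng.random((k, 3)) + 0.1
eps = 1e-9                       # avoids division by zero

for _ in range(500):
    H *= (W.T @ V) / (W.T @ W @ H + eps)   # update H, W fixed
    W *= (V @ H.T) / (W @ H @ H.T + eps)   # update W, H fixed

print(np.round(W @ H, 1))
```

Because every factor entry is non-negative, each latent dimension reads as an additive "part" of the data, which is the source of NMF's interpretability.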
User and Item Bias
Additional terms in the prediction that capture systematic tendencies of users (rating generously or harshly overall) and of items (intrinsic popularity), independent of the latent-factor interaction.
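With biases, the standard prediction becomes r̂ = μ + b_u + b_i + p_u·q_i, where μ is the global mean rating. A worked example with hypothetical values:

```python
import numpy as np

# Biased prediction: r_hat = mu + b_u + b_i + p_u . q_i (hypothetical values).
mu = 3.6                         # global average rating
b_u = 0.4                        # this user rates above average
b_i = -0.3                       # this item is rated below average
p_u = np.array([0.2, -0.1])      # user latent factors
q_i = np.array([0.5, 0.3])       # item latent factors

r_hat = mu + b_u + b_i + p_u @ q_i
print(round(float(r_hat), 2))    # 3.77
```

The biases absorb the bulk of the signal, leaving the dot product to model only the personalized residual.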
Pairwise Learning
Approach that directly optimizes the relative ranking of items by considering pairs (positive item, negative item) rather than absolute ratings.
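The best-known instance is BPR (Bayesian Personalized Ranking), which ascends the gradient of log σ(x_ui − x_uj) for a (positive, negative) item pair. One update step with hypothetical vectors:

```python
import numpy as np

# One BPR-style pairwise update (hypothetical vectors): widen the score gap
# between a positive item i and a sampled negative item j for user u.
lr = 0.1
p_u = np.array([0.3, 0.5])    # user factors
q_i = np.array([0.4, 0.2])    # item the user interacted with (positive)
q_j = np.array([0.6, 0.1])    # sampled item with no interaction (negative)

x_uij = p_u @ (q_i - q_j)             # score difference x_ui - x_uj
g = 1.0 / (1.0 + np.exp(x_uij))       # gradient factor: sigmoid(-x_uij)
pu_old = p_u.copy()                   # use pre-update value for item steps
p_u += lr * g * (q_i - q_j)           # gradient ascent on log sigmoid(x_uij)
q_i += lr * g * pu_old
q_j -= lr * g * pu_old

print(p_u @ q_i > p_u @ q_j)          # the ranking now prefers item i
```

Because only the relative order matters, pairwise learning suits implicit feedback (clicks, views), where absolute ratings do not exist.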
Cold Start Problem
Major challenge where factorization fails to generate reliable recommendations for new users or items lacking interaction history.
Tensor Factorization
Multidimensional extension of matrix factorization that allows modeling multiple dimensions simultaneously (user, item, context, time).
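In a CP-style (CANDECOMP/PARAFAC) decomposition, each entry of the 3-way tensor is the sum over latent dimensions of the elementwise product of three factor vectors. A single prediction with hypothetical factors:

```python
import numpy as np

# CP-style prediction for a user x item x context tensor (hypothetical
# factors): r_hat(u, i, t) = sum_f p_u[f] * q_i[f] * c_t[f].
p_u = np.array([0.5, 0.2])    # user factors
q_i = np.array([0.3, 0.8])    # item factors
c_t = np.array([1.0, 0.5])    # context factors (e.g. time of day)

r_hat = float(np.sum(p_u * q_i * c_t))   # prediction for (u, i, t)
print(round(r_hat, 3))
```

Setting the context vector to all ones recovers ordinary matrix factorization, which shows the construction is a strict generalization.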
Deep Learning for Factorization
Integration of neural networks to capture complex non-linear relationships between latent factors, improving recommendation accuracy.
Loss Function
Measure quantifying the gap between the predictions of the factorized model and actual values, serving as an objective to minimize during training.
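For rating prediction the common choice is the sum of squared errors over observed entries, L = Σ (r_ui − p_u·q_i)². A small computation with hypothetical predictions:

```python
import numpy as np

# Squared-error loss over observed ratings (hypothetical values).
preds = np.array([4.1, 2.8, 3.5])    # model predictions p_u . q_i
truth = np.array([4.0, 3.0, 3.0])    # observed ratings

loss = float(np.sum((truth - preds) ** 2))
print(round(loss, 2))   # 0.3
```

The regularization penalty described above is typically added to this objective, and SGD or ALS minimizes the combined total.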
Learning Rate
Hyperparameter controlling the magnitude of parameter updates during optimization, influencing the speed and stability of convergence.
Hybrid Embedding
Combination of matrix factorization with content-based embeddings, merging collaborative and content-based approaches.