AI Glossary
The complete dictionary of Artificial Intelligence
FedAvg (Federated Averaging)
Fundamental aggregation algorithm in federated learning that calculates the weighted average of local model weights based on client dataset sizes to create a global model.
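The weighted average at the heart of FedAvg can be sketched in a few lines. This is an illustrative sketch, not the API of any particular framework; the function and variable names are assumptions.

```python
import numpy as np

def fedavg(client_weights, client_sizes):
    """FedAvg aggregation sketch: weighted average of local model weights,
    with each client weighted by its local dataset size.
    client_weights: list of flattened parameter vectors (numpy arrays);
    client_sizes: number of training samples held by each client.
    """
    total = sum(client_sizes)
    return sum((n / total) * w for w, n in zip(client_weights, client_sizes))

# Two clients: the client holding 3x more data pulls the global model
# three times as hard toward its local weights.
global_w = fedavg([np.array([0.0, 0.0]), np.array([1.0, 1.0])], [1, 3])
```

With equal dataset sizes this reduces to a plain mean of the client models.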
FedProx (Federated Proximal)
Extension of FedAvg adding a proximal regularization term that constrains local updates to remain close to the global model, thereby mitigating the effects of client heterogeneity.
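The FedProx local objective is the client's task loss plus a quadratic penalty on the distance to the global model. A minimal sketch, assuming the task loss is precomputed; names are illustrative.

```python
import numpy as np

def fedprox_loss(local_loss, w_local, w_global, mu=0.01):
    """FedProx local objective sketch:
    f_i(w) + (mu / 2) * ||w - w_global||^2.
    The proximal term penalizes local weights that drift far from the
    global model; mu controls how tightly updates are constrained.
    """
    return local_loss + 0.5 * mu * np.sum((w_local - w_global) ** 2)

# No penalty when the local model equals the global model...
base = fedprox_loss(1.0, np.array([1.0, 2.0]), np.array([1.0, 2.0]))
# ...and a growing penalty as the local model drifts away.
penalized = fedprox_loss(1.0, np.array([2.0, 2.0]), np.array([1.0, 2.0]), mu=2.0)
```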
SCAFFOLD (Stochastic Controlled Averaging)
Advanced aggregation algorithm using control variates to correct client drift and reduce the impact of data heterogeneity.
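The core of SCAFFOLD is the drift-corrected local step, where the difference between the global and local control variates is added to each gradient. A simplified sketch of one such step (control-variate bookkeeping omitted); all names are assumptions.

```python
import numpy as np

def scaffold_step(w, grad, c_local, c_global, lr=0.1):
    """One SCAFFOLD-style corrected local step:
    w <- w - lr * (grad - c_local + c_global).
    The control variates estimate the client's and server's typical update
    directions; their difference cancels client drift on non-IID data.
    """
    return w - lr * (grad - c_local + c_global)

# When the local and global control variates agree, the correction
# cancels out and the step reduces to plain SGD.
c = np.array([0.5, 0.5])
stepped = scaffold_step(np.array([1.0, 1.0]), np.array([1.0, 1.0]), c, c)
```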
FedBN (Federated Batch Normalization)
Aggregation method maintaining local batch normalization statistics specific to each client while sharing other model parameters.
FedOpt (Federated Optimization)
Family of algorithms using advanced server optimizers (Adam, Yogi) to improve convergence in non-IID federated learning scenarios.
FedMA (Federated Matched Averaging)
Neural aggregation algorithm that matches and averages similar neurons between local models instead of directly aggregating weights.
FedNova (Federated Normalized Averaging)
Method normalizing local updates by the number of local optimization steps to correct aggregation biases in heterogeneous environments.
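The FedNova idea can be sketched as follows: divide each client's cumulative update by its number of local steps to get a per-step direction, then apply a weighted average rescaled by the effective step count. A simplified sketch assuming plain local SGD and a uniform local learning rate; names are illustrative.

```python
import numpy as np

def fednova_aggregate(w_global, client_deltas, client_steps, client_sizes):
    """FedNova-style aggregation sketch.
    client_deltas: cumulative updates delta_i = w_global - w_local_i;
    client_steps: number of local optimization steps tau_i per client.
    Normalizing by tau_i prevents clients that ran more local steps from
    dominating the aggregate in heterogeneous settings.
    """
    total = sum(client_sizes)
    p = [n / total for n in client_sizes]
    # Per-step normalized directions.
    d = [delta / tau for delta, tau in zip(client_deltas, client_steps)]
    # Effective number of steps for the global update.
    tau_eff = sum(pi * tau for pi, tau in zip(p, client_steps))
    return w_global - tau_eff * sum(pi * di for pi, di in zip(p, d))

# With a single client the normalization cancels and the full local
# update is applied to the global model.
w_new = fednova_aggregate(np.array([1.0, 1.0]), [np.array([0.4, 0.4])], [2], [10])
```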
FedYogi
Adaptive optimizer for federated learning combining FedAvg with the Yogi algorithm for better adaptation to non-IID data distributions.
FedAdam
Variant of FedAvg incorporating the Adam optimizer on the server side to dynamically manage learning rates and improve convergence.
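In the FedOpt family, the server treats the averaged client update as a pseudo-gradient and feeds it to an Adam-style optimizer. A minimal one-step sketch with textbook Adam hyperparameter names; nothing here is tied to a specific library.

```python
import numpy as np

def fedadam_server_step(w, delta, m, v, lr=0.1, b1=0.9, b2=0.99, tau=1e-3):
    """One server-side FedAdam update sketch.
    delta: averaged client pseudo-gradient (mean of local updates);
    m, v: first- and second-moment accumulators kept on the server;
    tau: adaptivity constant preventing division by zero.
    """
    m = b1 * m + (1 - b1) * delta
    v = b2 * v + (1 - b2) * delta ** 2
    w = w + lr * m / (np.sqrt(v) + tau)
    return w, m, v

# Starting from zero moments, one step with a unit pseudo-gradient.
w, m, v = fedadam_server_step(np.zeros(1), np.ones(1), np.zeros(1), np.zeros(1))
```

FedYogi and FedAdagrad follow the same pattern with a different second-moment update rule.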
FedPer (Federated Personalization)
Architecture dividing the model into a global base and local personalization layers, allowing specific adaptation for each client.
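The base/personalization split in FedPer amounts to partitioning the model's parameters: base layers are sent to the server for aggregation, personalization layers never leave the client. A sketch assuming a hypothetical naming convention (`head.` prefix) for the local layers.

```python
def split_fedper(state, head_prefix="head."):
    """Partition a parameter dict into a globally shared base and locally
    kept personalization layers. Only `base` is uploaded for aggregation;
    `head` stays on the client. The prefix convention is an assumption.
    """
    base = {k: v for k, v in state.items() if not k.startswith(head_prefix)}
    head = {k: v for k, v in state.items() if k.startswith(head_prefix)}
    return base, head

# Convolutional backbone is shared; the classifier head is personalized.
base, head = split_fedper({"conv.w": 1, "head.w": 2, "head.b": 3})
```

FedRep uses the same separation, with the representation layers global and the classifier local.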
FedRep (Federated Representation Learning)
Method separating the learning of representations (global) and classifiers (local) to optimize performance on heterogeneous data.
FedCurv (Federated Curvature)
Algorithm incorporating curvature information via the Fisher information matrix to improve aggregation in scenarios with strong client heterogeneity.
FedSGD (Federated Stochastic Gradient Descent)
Basic variant where each client computes a single gradient on its local data before aggregation, minimizing local computation at the cost of far more frequent communication rounds.
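A FedSGD round is simply a weighted average of client gradients followed by one server-side SGD step. Illustrative sketch; names are assumptions.

```python
import numpy as np

def fedsgd_round(w, client_grads, client_sizes, lr=0.1):
    """One FedSGD round sketch: each client sends a single local gradient;
    the server averages them weighted by dataset size and takes one
    gradient step on the global model.
    """
    total = sum(client_sizes)
    g = sum((n / total) * gr for gr, n in zip(client_grads, client_sizes))
    return w - lr * g

# Two equally sized clients: the server steps along the mean gradient.
w_new = fedsgd_round(np.array([1.0, 1.0]),
                     [np.array([1.0, 1.0]), np.array([3.0, 3.0])], [5, 5])
```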
FedDist (Federated Distillation)
Aggregation method based on knowledge distillation where clients share their softmax outputs rather than model weights.
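In federated distillation, what gets aggregated is the clients' predicted distributions on a shared input batch, not their weights. A sketch of producing the averaged soft labels that then serve as the distillation target; function names are illustrative.

```python
import numpy as np

def softmax(z):
    """Numerically stable softmax over the last axis."""
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def federated_soft_labels(client_logits):
    """Federated-distillation-style aggregation sketch: average the
    clients' softmax outputs on a shared batch. The result is a soft-label
    target for distillation, replacing weight exchange entirely.
    """
    return np.mean([softmax(logits) for logits in client_logits], axis=0)

# Two clients, one shared example with two classes; uniform logits
# produce uniform soft labels.
soft = federated_soft_labels([np.zeros((1, 2)), np.zeros((1, 2))])
```

Sharing predictions instead of weights cuts communication cost and allows clients to run heterogeneous model architectures.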
FedAdagrad
Combination of FedAvg with the Adagrad optimizer on the server side to adapt learning rates according to gradient history.
FedBN+ (Federated Batch Normalization Plus)
Advanced extension of FedBN using hybrid local and global normalization statistics to balance generalization and personalization.
FedMLD (Federated Multi-Layer Distillation)
Distillation technique applied to multiple model layers to efficiently transfer knowledge between heterogeneous clients.
FedAMP (Federated Attentive Message Passing)
Personalized method maintaining one server-side model per client and using attention-based message passing to strengthen collaboration between clients with similar data distributions.
FedRL (Federated Reinforcement Learning)
Aggregation paradigm specific to distributed reinforcement learning, combining locally learned policies or value functions into a global model.
FedCV (Federated Computer Vision)
Set of specialized aggregation algorithms for computer vision models processing distributed image data.