
AI Glossary

The Complete Artificial Intelligence Dictionary

162 Categories · 2,032 Subcategories · 23,060 Terms

FedAvg (Federated Averaging)

Fundamental aggregation algorithm in federated learning that calculates the weighted average of local model weights based on client dataset sizes to create a global model.
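The weighted average at the heart of FedAvg can be sketched in a few lines of NumPy. The `fedavg` helper below is illustrative, not from any particular library, and treats each client model as a single flat weight vector; real implementations apply the same average to every tensor in the model.

```python
import numpy as np

def fedavg(client_weights, client_sizes):
    # Weight each client's model by its share of the total data.
    total = sum(client_sizes)
    return sum((n / total) * w for w, n in zip(client_weights, client_sizes))

# Three clients whose models are flat weight vectors.
clients = [np.array([1.0, 2.0]), np.array([3.0, 4.0]), np.array([5.0, 6.0])]
sizes = [10, 30, 60]                 # local dataset sizes
global_w = fedavg(clients, sizes)    # -> array([4., 5.])
```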

FedProx (Federated Proximal)

Extension of FedAvg that adds a proximal regularization term constraining local updates to stay close to the global model, mitigating the effects of client heterogeneity.
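As a sketch, the FedProx local gradient is the ordinary task gradient plus the gradient of the proximal penalty (μ/2)·‖w − w_global‖²; the helper name and the value of μ below are illustrative.

```python
import numpy as np

def fedprox_grad(task_grad, w, w_global, mu=0.1):
    # The penalty (mu/2)*||w - w_global||^2 contributes mu*(w - w_global).
    return task_grad + mu * (w - w_global)

# With a zero task gradient, the penalty alone pulls w back toward w_global.
g = fedprox_grad(np.zeros(2), np.array([1.0, 1.0]), np.zeros(2), mu=0.1)
# g == array([0.1, 0.1])
```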

SCAFFOLD (Stochastic Controlled Averaging)

Advanced aggregation algorithm using control variates to correct client drift and reduce the impact of data heterogeneity.
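One local SCAFFOLD step can be sketched as follows: the difference between the server and client control variates is added to the gradient to counteract drift. All names below are illustrative.

```python
import numpy as np

def scaffold_step(w, grad, c_client, c_server, lr=0.1):
    # Corrected update: w <- w - lr * (grad - c_client + c_server)
    return w - lr * (grad - c_client + c_server)

# When the control variates agree, the step reduces to plain SGD.
w_new = scaffold_step(np.array([1.0]), np.array([0.5]),
                      np.array([0.2]), np.array([0.2]))
# w_new == array([0.95])
```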

FedBN (Federated Batch Normalization)

Aggregation method maintaining local batch normalization statistics specific to each client while sharing other model parameters.

FedOpt (Federated Optimization)

Family of algorithms using advanced server optimizers (Adam, Yogi) to improve convergence in non-IID federated learning scenarios.

FedMA (Federated Matching Averaging)

Layer-wise aggregation algorithm that matches and averages similar neurons across local models instead of averaging weights coordinate-wise.

FedNova (Federated Normalized Averaging)

Method normalizing local updates by the number of local optimization steps to correct aggregation biases in heterogeneous environments.
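A minimal sketch of the normalization idea, assuming full participation and flat weight vectors (function and variable names are illustrative): each client's cumulative update is divided by its number of local steps before the data-weighted average, so clients that ran more steps do not dominate.

```python
import numpy as np

def fednova(w_global, client_weights, client_sizes, local_steps):
    total = sum(client_sizes)
    p = [n / total for n in client_sizes]
    # Per-step (normalized) update direction for each client.
    d = [(w_global - w_i) / t for w_i, t in zip(client_weights, local_steps)]
    # Effective number of steps applied to the global model.
    tau_eff = sum(pi * t for pi, t in zip(p, local_steps))
    return w_global - tau_eff * sum(pi * di for pi, di in zip(p, d))

# With equal local steps, FedNova reduces to plain FedAvg.
new_w = fednova(np.zeros(2),
                [np.array([1.0, 2.0]), np.array([3.0, 4.0])],
                [1, 1], [5, 5])
# new_w == array([2., 3.])
```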

FedYogi

Adaptive optimizer for federated learning combining FedAvg with the Yogi algorithm for better adaptation to non-IID data distributions.

FedAdam

Variant of FedAvg incorporating the Adam optimizer on the server side to dynamically manage learning rates and improve convergence.
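A sketch of a single server-side FedAdam step, treating the average client update as a pseudo-gradient (no bias correction, following the common FedOpt-style formulation; names and hyperparameter values are illustrative).

```python
import numpy as np

def fedadam_server(w, delta, m, v, lr=0.01, b1=0.9, b2=0.99, eps=1e-3):
    # delta: average client update, used as a pseudo-gradient.
    m = b1 * m + (1 - b1) * delta          # first moment
    v = b2 * v + (1 - b2) * delta ** 2     # second moment
    w = w + lr * m / (np.sqrt(v) + eps)    # adaptive server step
    return w, m, v

w, m, v = fedadam_server(np.zeros(2), np.ones(2), np.zeros(2), np.zeros(2))
```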

FedPer (Federated Personalization)

Architecture dividing the model into a global base and local personalization layers, allowing specific adaptation for each client.
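The base/head split can be sketched as follows (the `"base"`/`"head"` dictionary layout is purely illustrative): only the shared base is averaged across clients, while each client's personalization head stays local.

```python
import numpy as np

def fedper_aggregate(client_states, client_sizes):
    # Average only the shared base; heads remain client-specific.
    total = sum(client_sizes)
    new_base = sum((n / total) * s["base"]
                   for s, n in zip(client_states, client_sizes))
    return new_base, [s["head"] for s in client_states]

states = [{"base": np.array([1.0, 2.0]), "head": np.array([0.0])},
          {"base": np.array([3.0, 4.0]), "head": np.array([9.0])}]
new_base, heads = fedper_aggregate(states, [1, 1])
# new_base == array([2., 3.]); heads are returned untouched
```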

FedRep (Federated Representation Learning)

Method separating the learning of representations (global) and classifiers (local) to optimize performance on heterogeneous data.

FedCurv (Federated Curvature)

Algorithm incorporating Fisher curvature information to improve aggregation in scenarios with strong client heterogeneity.

FedSGD (Federated Stochastic Gradient Descent)

Basic variant where clients perform a single gradient pass before aggregation, reducing local computation but increasing communication.
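A single FedSGD round can be sketched as follows (names illustrative): each client sends the raw gradient from one pass over its data, and the server applies their data-weighted average.

```python
import numpy as np

def fedsgd_round(w, client_grads, client_sizes, lr=0.1):
    total = sum(client_sizes)
    # Data-weighted average of the single-pass client gradients.
    g = sum((n / total) * gi for gi, n in zip(client_grads, client_sizes))
    return w - lr * g

w_new = fedsgd_round(np.zeros(2),
                     [np.array([1.0, 1.0]), np.array([3.0, 3.0])],
                     [1, 3])
# avg grad = 0.25*1 + 0.75*3 = 2.5 per coordinate -> w_new == [-0.25, -0.25]
```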

FedDist (Federated Distillation)

Aggregation method based on knowledge distillation where clients share their softmax outputs rather than model weights.
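A sketch of the aggregation signal, assuming clients evaluate the same shared public batch (all names illustrative): the server averages the clients' softmax outputs, and this averaged distribution becomes the distillation target.

```python
import numpy as np

def softmax(z):
    e = np.exp(z - z.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def feddist_targets(client_logits):
    # Average per-client softmax outputs on the same public inputs.
    probs = np.stack([softmax(z) for z in client_logits])
    return probs.mean(axis=0)

logits = [np.array([[2.0, 1.0, 0.0]]), np.array([[0.0, 1.0, 2.0]])]
targets = feddist_targets(logits)   # each row is a valid distribution
```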

FedAdagrad

Combination of FedAvg with the Adagrad optimizer on the server side to adapt learning rates according to gradient history.

FedBN+ (Federated Batch Normalization Plus)

Advanced extension of FedBN using hybrid local and global normalization statistics to balance generalization and personalization.

FedMLD (Federated Multi-Layer Distillation)

Distillation technique applied to multiple model layers to efficiently transfer knowledge between heterogeneous clients.

FedAMP (Federated Attentive Message Passing)

Personalized aggregation method maintaining a per-client model on the server and using attention-based message passing so that clients with similar data distributions influence each other more strongly.

FedRL (Federated Reinforcement Learning)

Aggregation paradigm specific to distributed reinforcement learning, combining the locally learned policies of distributed agents.

FedCV (Federated Computer Vision)

Set of specialized aggregation algorithms for computer vision models processing distributed image data.
