A survey for federated learning evaluations: Goals and measures
Evaluation is a systematic approach to assessing how well a system achieves its intended
purpose. Federated learning (FL) is a novel paradigm for privacy-preserving machine …
Stochastic controlled averaging for federated learning with communication compression
Communication compression, a technique aiming to reduce the information volume to be
transmitted over the air, has gained great interest in Federated Learning (FL) for the …
Federated learning: Challenges, SoTA, performance improvements and application domains
Federated Learning has emerged as a revolutionary technology in Machine Learning (ML),
enabling collaborative training of models in a distributed environment while ensuring privacy …
Nonlinear perturbation-based non-convex optimization over time-varying networks
M Doostmohammadian, ZR Gabidullina… - IEEE Transactions on …, 2024 - ieeexplore.ieee.org
Decentralized optimization strategies are helpful for various applications, from networked
estimation to distributed machine learning. This paper studies finite-sum minimization …
Bandwidth-Aware and Overlap-Weighted Compression for Communication-Efficient Federated Learning
Current data compression methods, such as sparsification in Federated Averaging
(FedAvg), effectively enhance the communication efficiency of Federated Learning (FL) …
Achieving dimension-free communication in federated learning via zeroth-order optimization
Federated Learning (FL) offers a promising framework for collaborative and privacy-
preserving machine learning across distributed data sources. However, the substantial …
Byzantine-resilient Federated Learning Employing Normalized Gradients on Non-IID Datasets
In practical federated learning (FL) systems, the presence of malicious Byzantine attacks
and data heterogeneity often introduces biases into the learning process. However, existing …
SignSGD with Federated Defense: Harnessing Adversarial Attacks through Gradient Sign Decoding
Distributed learning is an effective approach to accelerate model training using multiple
workers. However, substantial communication delays emerge between workers and a …
Sequential Federated Learning in Hierarchical Architecture on Non-IID Datasets
In a real federated learning (FL) system, communication overhead for passing model
parameters between the clients and the parameter server (PS) is often a bottleneck …
Mask-Encoded Sparsification: Mitigating Biased Gradients in Communication-Efficient Split Learning
This paper introduces a novel framework designed to achieve a high compression ratio in
Split Learning (SL) scenarios where resource-constrained devices are involved in large …