Momentum provably improves error feedback!
Due to the high communication overhead when training machine learning models in a
distributed environment, modern algorithms invariably rely on lossy communication …
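The entry above concerns combining error feedback with momentum. As a rough illustration of the general pattern (a minimal single-worker sketch, not the paper's exact algorithm), the code below runs EF21-style error feedback on a momentum-averaged gradient estimate; the Top-k compressor, the quadratic test problem, and all step-size/momentum values are illustrative assumptions.

```python
import numpy as np

def top_k(v: np.ndarray, k: int) -> np.ndarray:
    """Contractive compressor: keep the k largest-magnitude entries, zero the rest."""
    out = np.zeros_like(v)
    idx = np.argsort(np.abs(v))[-k:]
    out[idx] = v[idx]
    return out

def ef21_momentum(grad_fn, x0, steps=500, lr=0.05, eta=0.2, k=1):
    """Sketch: error feedback (EF21-style) with a momentum gradient estimate.
    grad_fn, lr, eta, k are illustrative assumptions."""
    x = x0.copy()
    v = grad_fn(x)   # momentum-averaged gradient estimate
    g = v.copy()     # server-side state: the gradient the server believes
    for _ in range(steps):
        x = x - lr * g                         # server model update
        v = (1 - eta) * v + eta * grad_fn(x)   # momentum averaging
        g = g + top_k(v - g, k)                # worker sends a compressed correction
    return x

# Usage: minimize the quadratic f(x) = 0.5 * ||A x - b||^2.
A = np.array([[3.0, 1.0, 0.0], [1.0, 2.0, 0.5], [0.0, 0.5, 1.0]])
b = np.array([1.0, -2.0, 0.5])
grad = lambda x: A.T @ (A @ x - b)
x_star = ef21_momentum(grad, np.zeros(3))
```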
EF21-P and friends: Improved theoretical communication complexity for distributed optimization with bidirectional compression
In this work we focus our attention on distributed optimization problems in the context where
the communication time between the server and the workers is non-negligible. We obtain …
Lower bounds and nearly optimal algorithms in distributed learning with communication compression
Recent advances in distributed optimization and learning have shown that communication
compression is one of the most effective means of reducing communication. While there …
Communication acceleration of local gradient methods via an accelerated primal-dual algorithm with an inexact prox
Inspired by a recent breakthrough of Mishchenko et al. [2022], who for the first time showed
that local gradient steps can lead to provable communication acceleration, we propose an …
Queuing dynamics of asynchronous Federated Learning
We study asynchronous federated learning mechanisms with nodes having potentially
different computational speeds. In such an environment, each node is allowed to work on …
Achieving lossless gradient sparsification via mapping to alternative space in federated learning
Handling the substantial communication burden in federated learning (FL) still remains a
significant challenge. Although recent studies have attempted to compress the local …
Anchor sampling for federated learning with partial client participation
Compared with full client participation, partial client participation is a more practical scenario
in federated learning, but it may amplify some challenges in federated learning, such as data …
2Direction: Theoretically faster distributed training with bidirectional communication compression
We consider distributed convex optimization problems in the regime when the
communication between the server and the workers is expensive in both uplink and …
On the convergence of FedProx with extrapolation and inexact prox
Enhancing the FedProx federated learning algorithm (Li et al., 2020) with server-side
extrapolation, Li et al. (2024a) recently introduced the FedExProx method. Their theoretical …
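The snippet above names the mechanism: FedProx-style proximal client updates combined with server-side extrapolation. Below is a minimal sketch of that pattern, assuming quadratic client losses so the prox has a closed form; the helper names and the parameter values (gamma, alpha) are illustrative assumptions, not the paper's exact method.

```python
import numpy as np

def prox_quadratic(x, A, b, gamma):
    """Closed-form prox of f_i(w) = 0.5 * ||A w - b||^2, i.e.
    argmin_w f_i(w) + (1 / (2 * gamma)) * ||w - x||^2."""
    d = x.shape[0]
    return np.linalg.solve(gamma * A.T @ A + np.eye(d), gamma * A.T @ b + x)

def fedexprox_sketch(client_data, x0, rounds=100, gamma=1.0, alpha=1.5):
    x = x0.copy()
    for _ in range(rounds):
        # Each client solves its local proximal subproblem (FedProx-style step).
        prox_avg = np.mean(
            [prox_quadratic(x, A, b, gamma) for A, b in client_data], axis=0
        )
        # Server-side extrapolation: alpha = 1 recovers plain averaging.
        x = x + alpha * (prox_avg - x)
    return x

# Usage: two clients with different quadratic losses.
clients = [(np.eye(2), np.array([1.0, 0.0])),
           (2.0 * np.eye(2), np.array([0.0, 1.0]))]
w = fedexprox_sketch(clients, np.zeros(2))
```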
Improving the worst-case bidirectional communication complexity for nonconvex distributed optimization under function similarity
Effective communication between the server and workers plays a key role in distributed
optimization. In this paper, we focus on optimizing the server-to-worker communication …