Acceleration for compressed gradient descent in distributed and federated optimization
Due to the high communication cost in distributed and federated learning problems,
methods relying on compression of communicated messages are becoming increasingly …
SoteriaFL: A unified framework for private federated learning with communication compression
To enable large-scale machine learning in bandwidth-hungry environments such as
wireless networks, significant progress has been made recently in designing communication …
Elastic aggregation for federated optimization
Federated learning enables the privacy-preserving training of neural network models using
real-world data across distributed clients. FedAvg has become the preferred optimizer for …
Recent theoretical advances in non-convex optimization
Motivated by recently increased interest in optimization algorithms for non-convex
problems arising in training deep neural networks and other optimization problems …
EF21 with bells & whistles: Practical algorithmic extensions of modern error feedback
First proposed by Seide et al. (2014) as a heuristic, error feedback (EF) is a very popular
mechanism for enforcing convergence of distributed gradient-based optimization methods …