FedPAQ: A communication-efficient federated learning method with periodic averaging and quantization

A Reisizadeh, A Mokhtari, H Hassani… - International …, 2020 - proceedings.mlr.press
Federated learning is a distributed framework according to which a model is trained over a
set of devices, while keeping data localized. This framework faces several systems-oriented …
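
The core recipe here is periodic averaging (devices run several local SGD steps between synchronizations) combined with a low-precision, unbiased quantizer on the uploads. Below is a minimal sketch of that recipe, assuming a QSGD-style stochastic quantizer; the function names, the choice of quantizer, and all parameter values are illustrative, not taken from the paper's code.

```python
import numpy as np

def stochastic_quantize(v, num_levels=16):
    """Unbiased stochastic uniform quantizer (QSGD-style assumption).

    Each coordinate of v is scaled by ||v|| and randomly rounded to one of
    `num_levels` uniform levels so that E[Q(v)] = v holds exactly.
    """
    norm = np.linalg.norm(v)
    if norm == 0.0:
        return np.zeros_like(v)
    scaled = np.abs(v) / norm * (num_levels - 1)             # values in [0, L-1]
    lower = np.floor(scaled)
    round_up = np.random.rand(*v.shape) < (scaled - lower)   # unbiased rounding
    return norm * np.sign(v) * (lower + round_up) / (num_levels - 1)

def fedpaq_style_round(model, device_grad_fns, lr=0.1, tau=10):
    """One communication round: each device runs tau local SGD steps, then
    uploads a quantized model difference, which the server averages."""
    deltas = []
    for grad_fn in device_grad_fns:
        local = model.copy()
        for _ in range(tau):                  # periodic averaging: tau local steps
            local -= lr * grad_fn(local)
        deltas.append(stochastic_quantize(local - model))
    return model + np.mean(deltas, axis=0)    # server aggregates compressed deltas
```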

Communication compression techniques in distributed deep learning: A survey

Z Wang, M Wen, Y Xu, Y Zhou, JH Wang… - Journal of Systems …, 2023 - Elsevier
Nowadays, training data and neural network models are growing increasingly large, and the
training time of deep learning becomes unbearably long on a single machine. To reduce …

ElasticFlow: An elastic serverless training platform for distributed deep learning

D Gu, Y Zhao, Y Zhong, Y Xiong, Z Han… - Proceedings of the 28th …, 2023 - dl.acm.org
This paper proposes ElasticFlow, an elastic serverless training platform for distributed deep
learning. ElasticFlow provides a serverless interface with two distinct features: (i) users …

An exact quantized decentralized gradient descent algorithm

A Reisizadeh, A Mokhtari, H Hassani… - IEEE Transactions on …, 2019 - ieeexplore.ieee.org
We consider the problem of decentralized consensus optimization, where the sum of n
smooth and strongly convex functions is minimized over n distributed agents that form a …
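
The setting is n agents on a graph, each holding a local cost and exchanging quantized iterates with neighbors through a mixing matrix. The sketch below shows one round of the generic quantized decentralized gradient descent template this line of work starts from; the paper's contribution is an exact variant that removes the quantization error floor, which this toy version does not capture. `W`, `quantizer`, and the step size are illustrative assumptions.

```python
import numpy as np

def quantized_dgd_step(x, W, grad_fns, quantizer, alpha=0.05):
    """One round of decentralized gradient descent with quantized exchange.

    x:         (n, d) array; row i is agent i's current iterate
    W:         (n, n) doubly stochastic mixing matrix of the network graph
    grad_fns:  list of n local-gradient callables, one per agent
    quantizer: compression operator applied to every broadcast iterate
    """
    q = np.stack([quantizer(xi) for xi in x])            # agents broadcast Q(x_i)
    g = np.stack([f(xi) for f, xi in zip(grad_fns, x)])  # local gradients
    return W @ q - alpha * g                             # consensus step + descent
```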

Robust and communication-efficient collaborative learning

A Reisizadeh, H Taheri, A Mokhtari… - Advances in …, 2019 - proceedings.neurips.cc
We consider a decentralized learning problem, where a set of computing nodes aims to
solve a non-convex optimization problem collaboratively. It is well-known that …

Quantization for decentralized learning under subspace constraints

R Nassif, S Vlaski, M Carpentiero… - IEEE Transactions …, 2023 - ieeexplore.ieee.org
In this article, we consider decentralized optimization problems where agents have
individual cost functions to minimize subject to subspace constraints that require the …

Serverless federated AUPRC optimization for multi-party collaborative imbalanced data mining

X Wu, Z Hu, J Pei, H Huang - Proceedings of the 29th ACM SIGKDD …, 2023 - dl.acm.org
To address big data challenges, serverless multi-party collaborative training has recently
attracted attention in the data mining community, since it can cut down the …

Double quantization for communication-efficient distributed optimization

Y Yu, J Wu, L Huang - Advances in neural information …, 2019 - proceedings.neurips.cc
Modern distributed training of machine learning models often suffers from high
communication overhead for synchronizing stochastic gradients and model parameters. In …
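
"Double" here refers to compressing both communication directions: workers upload quantized gradients and the server broadcasts quantized parameters. A rough sketch under that reading follows; the paper additionally applies correction terms that this toy version omits, and the function names and quantizer choice are assumptions (any unbiased compressor, such as the QSGD-style one sketched earlier, could be passed in).

```python
import numpy as np

def double_quantized_round(model, worker_grad_fns, quantizer, lr=0.1):
    """One synchronous round with compression on both links.

    Workers pull a quantized copy of the model, push quantized gradients,
    and the server averages the gradients before updating.
    """
    q_model = quantizer(model)                                # server -> workers
    grads = [quantizer(g(q_model)) for g in worker_grad_fns]  # workers -> server
    return model - lr * np.mean(grads, axis=0)
```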

Finite-bit quantization for distributed algorithms with linear convergence

N Michelusi, G Scutari, CS Lee - IEEE Transactions on …, 2022 - ieeexplore.ieee.org
This paper studies distributed algorithms for (strongly convex) composite optimization
problems over mesh networks, subject to quantized communications. Instead of focusing on …

Error-compensated sparsification for communication-efficient decentralized training in edge environment

H Wang, S Guo, Z Qu, R Li, Z Liu - IEEE Transactions on …, 2021 - ieeexplore.ieee.org
Communication has been considered a major bottleneck in large-scale decentralized
training systems, since participating nodes iteratively exchange large amounts of …
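
Error compensation means coordinates dropped by the sparsifier are not discarded but accumulated in a local residual and re-injected before the next selection, so every gradient component is eventually transmitted. Below is a minimal sketch of that mechanism; the helper name and the choice of top-k as the sparsifier are assumptions for illustration, not the paper's exact scheme.

```python
import numpy as np

def topk_with_error_feedback(grad, residual, k):
    """Top-k sparsification with error compensation.

    The residual carries coordinates dropped in earlier rounds; adding it
    back before selection guarantees no component is lost permanently.
    """
    corrected = grad + residual
    idx = np.argpartition(np.abs(corrected), -k)[-k:]   # k largest magnitudes
    sparse = np.zeros_like(corrected)
    sparse[idx] = corrected[idx]                        # transmit only these
    return sparse, corrected - sparse                   # new residual for next round
```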