Communication-efficient distributed deep learning: A comprehensive survey
Distributed deep learning (DL) has become prevalent in recent years to reduce training time
by leveraging multiple computing devices (e.g., GPUs/TPUs) due to larger models and …
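To ground the data-parallel setting such surveys cover, here is a minimal sketch of synchronous gradient averaging across workers; the toy model, shard count, and learning rate are all hypothetical:

    import numpy as np

    # Toy model: linear regression with squared loss (a hypothetical
    # stand-in for a deep network; only the communication pattern matters).
    def grad(w, X, y):
        return 2 * X.T @ (X @ w - y) / len(y)

    rng = np.random.default_rng(0)
    X = rng.normal(size=(1000, 10))
    y = X @ rng.normal(size=10)

    # Data-parallel SGD: shard the data across 4 "workers" and average
    # their gradients every step -- the all-reduce whose cost this line of
    # work tries to shrink via compression, sparsification, and scheduling.
    shards = np.array_split(np.arange(len(y)), 4)
    w = np.zeros(10)
    for step in range(200):
        g = np.mean([grad(w, X[s], y[s]) for s in shards], axis=0)
        w -= 0.05 * g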
Edge learning: The enabling technology for distributed big data analytics in the edge
Machine Learning (ML) has demonstrated great promise in various fields, e.g., self-driving and
smart cities, which are fundamentally altering the way individuals and organizations live, work …
Sparsity in deep learning: Pruning and growth for efficient inference and training in neural networks
The growing energy and performance costs of deep learning have driven the community to
reduce the size of neural networks by selectively pruning components. Similarly to their …
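A minimal sketch of the magnitude-based pruning that this survey takes as a baseline; the tensor shape and sparsity level are illustrative:

    import numpy as np

    def magnitude_prune(weights, sparsity):
        # Zero out (approximately) the smallest-magnitude fraction
        # `sparsity` of entries; ties at the threshold may prune a few more.
        flat = np.abs(weights).ravel()
        k = int(sparsity * flat.size)
        if k == 0:
            return weights.copy()
        threshold = np.partition(flat, k - 1)[k - 1]
        return np.where(np.abs(weights) <= threshold, 0.0, weights)

    W = np.random.default_rng(0).normal(size=(64, 64))
    print(np.mean(magnitude_prune(W, sparsity=0.9) == 0))  # ~0.9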
Advances and open problems in federated learning
Federated learning (FL) is a machine learning setting where many clients (e.g., mobile
devices or whole organizations) collaboratively train a model under the orchestration of a …
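The orchestration pattern described here is FedAvg-style: broadcast the global model, train locally on private data, and average the returned models. A minimal sketch (the linear model, client count, and round/epoch numbers are all hypothetical):

    import numpy as np

    def local_sgd(w, X, y, lr=0.05, epochs=5):
        # Client-side training on data that never leaves the device.
        w = w.copy()
        for _ in range(epochs):
            w -= lr * 2 * X.T @ (X @ w - y) / len(y)
        return w

    rng = np.random.default_rng(0)
    w_true = rng.normal(size=5)
    clients = []
    for _ in range(10):
        X = rng.normal(size=(50, 5))
        clients.append((X, X @ w_true + 0.1 * rng.normal(size=50)))

    # Server loop: broadcast w_global, collect locally trained models,
    # average them (weighted by client data size in the general form;
    # uniform here since every client holds 50 samples).
    w_global = np.zeros(5)
    for rnd in range(20):
        updates = [local_sgd(w_global, X, y) for X, y in clients]
        w_global = np.mean(updates, axis=0)
    print(np.linalg.norm(w_global - w_true))  # small after a few rounds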
Federated learning over wireless fading channels
We study federated machine learning at the wireless network edge, where power-limited
wireless devices, each with its own dataset, build a joint model with the help of a remote …
Machine learning at the wireless edge: Distributed stochastic gradient descent over-the-air
We study federated machine learning (ML) at the wireless edge, where power- and
bandwidth-limited wireless devices with local datasets carry out distributed stochastic …
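The core idea in both of the over-the-air papers above is that a multiple-access channel adds the devices' simultaneous analog transmissions for free, so the parameter server receives a (noisy) gradient sum in one shot instead of decoding each device separately. A deliberately idealized sketch, ignoring fading and power control (dimension, device count, and noise level are made up):

    import numpy as np

    rng = np.random.default_rng(0)
    d, n_devices = 16, 8
    grads = rng.normal(size=(n_devices, d))  # local stochastic gradients

    # All devices transmit at once; the channel superimposes the signals,
    # so the receiver sees sum(grads) + noise using only d channel uses.
    noise = 0.01 * rng.normal(size=d)
    received = grads.sum(axis=0) + noise
    g_hat = received / n_devices  # estimate of the average gradient
    print(np.linalg.norm(g_hat - grads.mean(axis=0)))  # small

Under fading, each device's signal is additionally scaled by its channel gain; the papers address this with power control and related alignment techniques.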
Decentralized stochastic optimization and gossip algorithms with compressed communication
We consider decentralized stochastic optimization with the objective function (e.g., data
samples for machine learning tasks) being distributed over n machines that can only …
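A toy version of compressed gossip on a ring: each node repeatedly mixes its value with quantized copies of its neighbors' values. This naive scheme only reaches consensus up to the quantization error; the compensated schemes studied in this line of work recover exact convergence. All constants are illustrative:

    import numpy as np

    def quantize(v, levels=16):
        # Crude uniform quantizer standing in for an arbitrary compressor.
        scale = np.max(np.abs(v)) + 1e-12
        return np.round(v / scale * levels) / levels * scale

    rng = np.random.default_rng(0)
    x = rng.normal(size=8)  # one scalar per node, ring topology
    target = x.mean()

    for _ in range(200):
        q = quantize(x)
        # Mix with left/right neighbors using only their quantized values.
        x = 0.5 * x + 0.25 * (np.roll(q, 1) + np.roll(q, -1))
    print(np.max(np.abs(x - target)))  # small, but floored by quantization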
Expanding the reach of federated learning by reducing client resource requirements
Communication on heterogeneous edge networks is a fundamental bottleneck in Federated
Learning (FL), restricting both model capacity and user participation. To address this issue …
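One concrete way to cut both downlink and uplink traffic, loosely in the spirit of this paper's proposal: the server sends each client only a random sub-network, the client trains that slice, and the server maps the update back into the full model. The sizes and the stand-in "update" below are hypothetical:

    import numpy as np

    rng = np.random.default_rng(0)
    W = rng.normal(size=(100, 100))  # full server-side weight matrix

    keep = rng.choice(100, size=50, replace=False)    # ~4x fewer parameters
    sub_W = W[np.ix_(keep, keep)]                     # downlink payload
    sub_update = 0.01 * rng.normal(size=sub_W.shape)  # stand-in for training
    W[np.ix_(keep, keep)] += sub_update               # apply uplink payload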
Gradient sparsification for communication-efficient distributed optimization
Modern large-scale machine learning applications require stochastic optimization
algorithms to be implemented on distributed computational architectures. A key bottleneck is …
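This line of work keeps sparsified gradients unbiased by dropping coordinate i with probability 1 - p_i and rescaling survivors by 1/p_i. A minimal sketch with magnitude-proportional probabilities (the paper derives variance-optimal ones; `budget`, an illustrative knob, sets the expected number of coordinates kept):

    import numpy as np

    def sparsify_unbiased(v, budget, rng):
        # Keep coordinate i with prob p_i proportional to |v_i| (capped at
        # 1) and rescale by 1/p_i, so E[output] = v (unbiased).
        p = np.minimum(1.0, budget * np.abs(v) / np.sum(np.abs(v)))
        mask = rng.random(v.size) < p
        out = np.zeros_like(v)
        out[mask] = v[mask] / p[mask]
        return out

    rng = np.random.default_rng(0)
    g = rng.normal(size=1000)
    s = sparsify_unbiased(g, budget=50, rng=rng)
    print(np.count_nonzero(s))  # ~50 entries actually transmitted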
cpSGD: Communication-efficient and differentially-private distributed SGD
Distributed stochastic gradient descent is an important subroutine in distributed learning. A
setting of particular interest is when the clients are mobile devices, where two important …
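A loose sketch of the cpSGD recipe: stochastically quantize each gradient coordinate to a few levels (communication efficiency), then add binomial noise, a discrete surrogate for Gaussian noise (differential privacy). The parameters below are illustrative; the paper calibrates k and m jointly to obtain formal (epsilon, delta) guarantees:

    import numpy as np

    rng = np.random.default_rng(0)

    def encode(g, k=16, m=32, r=1.0):
        # Map each coordinate from [-r, r] onto levels 0..k-1, round
        # stochastically (unbiased), then add Binomial(m, 1/2) noise.
        levels = (np.clip(g, -r, r) + r) / (2 * r) * (k - 1)
        low = np.floor(levels)
        q = low + (rng.random(g.shape) < (levels - low))
        return q + rng.binomial(m, 0.5, size=g.shape)  # small-integer message

    def decode(msg, k=16, m=32, r=1.0):
        # Subtract the noise mean m/2 and invert the quantization map.
        return (msg - m / 2) / (k - 1) * (2 * r) - r

    g = rng.normal(scale=0.3, size=8)
    print(g)
    print(decode(encode(g)))  # unbiased but noisy reconstruction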