Distributed learning in wireless networks: Recent progress and future challenges
The next generation of wireless networks will enable many machine learning (ML) tools and
applications to efficiently analyze various types of data collected by edge devices for …
A survey on distributed machine learning
J Verbraeken, M Wolting, J Katzy… - ACM Computing Surveys …, 2020 - dl.acm.org
The demand for artificial intelligence has grown significantly over the past decade, and this
growth has been fueled by advances in machine learning techniques and the ability to …
Federated learning on non-IID data: A survey
Federated learning is an emerging distributed machine learning framework for privacy
preservation. However, models trained in federated learning usually have worse …
Sparsity in deep learning: Pruning and growth for efficient inference and training in neural networks
The growing energy and performance costs of deep learning have driven the community to
reduce the size of neural networks by selectively pruning components. Similarly to their …
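One of the basic techniques the survey above covers is magnitude pruning: weights with the smallest absolute values are zeroed out to sparsify the network. The snippet below is a minimal, illustrative sketch of global magnitude pruning with NumPy; the function name and threshold choice are assumptions for this example, not the survey's prescribed method.

```python
# Minimal sketch of global magnitude pruning (illustrative only).
import numpy as np

def magnitude_prune(weights, sparsity=0.9):
    """Zero out the smallest-magnitude entries so that `sparsity` fraction become zero."""
    threshold = np.quantile(np.abs(weights), sparsity)  # cut-off separating kept vs. pruned weights
    mask = np.abs(weights) > threshold                   # boolean mask of surviving weights
    return weights * mask, mask

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    w = rng.normal(size=(256, 256))
    pruned, mask = magnitude_prune(w, sparsity=0.9)
    print("kept fraction:", mask.mean())                 # roughly 0.10 of the weights survive
```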
CocktailSGD: Fine-tuning foundation models over 500Mbps networks
Distributed training of foundation models, especially large language models (LLMs), is
communication-intensive and so has heavily relied on centralized data centers with fast …
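Methods in this line of work cut the communication cost of distributed training by compressing the gradients exchanged between workers. The sketch below shows one generic combination, top-k sparsification plus coarse quantization, purely as an illustration; CocktailSGD's actual mix of compressors and its error-feedback mechanism differ in detail, and all names here are assumed.

```python
# Illustrative sketch of compressed gradient exchange (top-k sparsification + quantization).
import numpy as np

def compress_gradient(grad, k_fraction=0.01, levels=16):
    """Keep the top-k entries by magnitude and quantize them to a few levels."""
    flat = grad.ravel()
    k = max(1, int(k_fraction * flat.size))
    idx = np.argpartition(np.abs(flat), -k)[-k:]          # indices of the k largest-magnitude entries
    vals = flat[idx]
    scale = np.abs(vals).max() / (levels / 2) or 1.0      # uniform quantization step
    q = np.round(vals / scale).astype(np.int8)            # low-bit payload sent over the network
    return idx, q, scale

def decompress_gradient(idx, q, scale, shape):
    """Reconstruct a dense (sparse, dequantized) gradient on the receiving worker."""
    flat = np.zeros(int(np.prod(shape)))
    flat[idx] = q.astype(np.float64) * scale
    return flat.reshape(shape)
```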
Convergence of edge computing and deep learning: A comprehensive survey
Ubiquitous sensors and smart devices from factories and communities are generating
massive amounts of data, and ever-increasing computing power is driving the core of …
FedPAQ: A communication-efficient federated learning method with periodic averaging and quantization
Federated learning is a distributed framework according to which a model is trained over a
set of devices, while keeping data localized. This framework faces several systems-oriented …
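The title names the two ingredients: clients take several local steps between synchronizations (periodic averaging) and upload quantized model updates. The sketch below is a simplified illustration of a single such round on a toy least-squares objective; the data format, learning rate, and the deterministic rounding quantizer are assumptions, and the paper's actual (unbiased, randomized) quantizer and analysis are not reproduced here.

```python
# Simplified sketch of one round of periodic averaging with quantized updates.
import numpy as np

def quantize(delta, levels=256):
    """Uniformly quantize a model update before uploading it (dequantized on the spot for clarity)."""
    scale = np.abs(delta).max() / (levels / 2 - 1) or 1.0
    return np.round(delta / scale) * scale

def local_update(model, data, lr=0.1, local_steps=5):
    """A few local SGD steps on a toy least-squares problem; `data` is assumed to be (X, y)."""
    X, y = data
    w = model.copy()
    for _ in range(local_steps):
        grad = X.T @ (X @ w - y) / len(y)
        w -= lr * grad
    return w - model                                      # the update the client transmits

def round_of_training(model, client_datasets):
    updates = [quantize(local_update(model, d)) for d in client_datasets]
    return model + np.mean(updates, axis=0)               # server averages the quantized updates
```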
PipeDream: Generalized pipeline parallelism for DNN training
DNN training is extremely time-consuming, necessitating efficient multi-accelerator
parallelization. Current approaches to parallelizing training primarily use intra-batch …
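Pipeline parallelism splits the model into stages placed on different workers and streams micro-batches through them, so different stages process different micro-batches at the same time. The toy schedule below only illustrates that forward-pass overlap; PipeDream additionally interleaves forward and backward work (1F1B) and versions weights, which this assumed, simplified sketch does not model.

```python
# Toy illustration of how micro-batches overlap across pipeline stages.
def pipeline_schedule(num_stages, num_microbatches):
    """Return, per time step, which (stage, micro-batch) pairs run concurrently."""
    steps = []
    for t in range(num_stages + num_microbatches - 1):
        active = [(s, t - s) for s in range(num_stages) if 0 <= t - s < num_microbatches]
        steps.append(active)
    return steps

for t, active in enumerate(pipeline_schedule(num_stages=3, num_microbatches=4)):
    print(f"step {t}: " + ", ".join(f"stage {s} <- microbatch {m}" for s, m in active))
```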