Distributed learning in wireless networks: Recent progress and future challenges

M Chen, D Gündüz, K Huang, W Saad… - IEEE Journal on …, 2021 - ieeexplore.ieee.org
The next generation of wireless networks will enable many machine learning (ML) tools and
applications to efficiently analyze various types of data collected by edge devices for …

Communication-efficient distributed learning: An overview

X Cao, T Başar, S Diggavi, YC Eldar… - IEEE Journal on …, 2023 - ieeexplore.ieee.org
Distributed learning is envisioned as the bedrock of next-generation intelligent networks,
where intelligent agents, such as mobile devices, robots, and sensors, exchange information …

Communication-efficient federated learning

M Chen, N Shlezinger, HV Poor… - Proceedings of the …, 2021 - National Acad Sciences
Federated learning (FL) enables edge devices, such as Internet of Things devices (e.g.,
sensors), servers, and institutions (e.g., hospitals), to collaboratively train a machine learning …

Decentralized federated averaging

T Sun, D Li, B Wang - IEEE Transactions on Pattern Analysis …, 2022 - ieeexplore.ieee.org
Federated averaging (FedAvg) is a communication-efficient algorithm for distributed training
with an enormous number of clients. In FedAvg, clients keep their data locally for privacy …
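
The FedAvg mechanism is simple enough to state in a few lines. Below is a minimal, illustrative sketch of one communication round, assuming a linear model with squared loss; all function and variable names are our own, not from the paper. Each client runs local SGD on its private data, and the server averages the returned weights, weighted by local dataset size.

    import numpy as np

    def client_update(weights, data, labels, lr=0.01, epochs=1):
        # Local SGD on one client's private data (linear model, squared loss).
        w = weights.copy()
        for _ in range(epochs):
            for x, y in zip(data, labels):
                grad = 2 * (w @ x - y) * x  # gradient of (w.x - y)^2
                w -= lr * grad
        return w

    def fedavg_round(global_w, clients):
        # One round: broadcast global_w, train locally, average by data size.
        local_ws, sizes = [], []
        for data, labels in clients:
            local_ws.append(client_update(global_w, data, labels))
            sizes.append(len(data))
        sizes = np.array(sizes, dtype=float)
        return np.average(local_ws, axis=0, weights=sizes / sizes.sum())

Raw data never leaves a client; only model weights cross the network, and that per-round exchange is the communication cost the methods surveyed here try to reduce.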

A survey of federated learning for edge computing: Research problems and solutions

Q Xia, W Ye, Z Tao, J Wu, Q Li - High-Confidence Computing, 2021 - Elsevier
Federated Learning is a machine learning scheme in which a shared prediction model can
be collaboratively learned by a number of distributed nodes using their locally stored data. It …

Fairness-aware agnostic federated learning

W Du, D Xu, X Wu, H Tong - Proceedings of the 2021 SIAM International …, 2021 - SIAM
Federated learning is an emerging framework that builds centralized machine learning
models with training data distributed across multiple devices. Most of the previous works …
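
For context, the agnostic federated learning objective that fairness-aware variants build on replaces the usual uniform average of client losses with a worst-case mixture; in our notation (symbols ours, following the agnostic FL framework of Mohri et al.),

    \min_{w} \max_{\lambda \in \Delta_K} \sum_{k=1}^{K} \lambda_k F_k(w),

where F_k is client k's empirical loss and \Delta_K is the probability simplex over the K clients, so no single client's distribution can be sacrificed to improve the average.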

Adaptive quantization of model updates for communication-efficient federated learning

D Jhunjhunwala, A Gadhikar, G Joshi… - ICASSP 2021-2021 …, 2021 - ieeexplore.ieee.org
Communication of model updates between client nodes and the central aggregating server
is a major bottleneck in federated learning, especially in bandwidth-limited settings and high …
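
The standard building block behind such schemes is unbiased stochastic quantization, which rounds each coordinate to one of a few levels so that only small integers and one scale factor are transmitted. The sketch below is a generic uniform quantizer, not the paper's adaptive scheme (which additionally tunes the number of levels across rounds); names are ours.

    import numpy as np

    def stochastic_quantize(v, levels=4):
        # Quantize vector v to `levels` values per coordinate. Stochastic
        # rounding makes it unbiased: E[dequantize(q)] == v. Only the
        # scale (one float) and small integer indices need to be sent.
        scale = np.abs(v).max()
        if scale == 0.0:
            return np.zeros_like(v, dtype=np.int8), 0.0
        normalized = np.abs(v) / scale * (levels - 1)  # in [0, levels-1]
        lower = np.floor(normalized)
        # round up with probability equal to the fractional part
        q = lower + (np.random.rand(*v.shape) < (normalized - lower))
        return (np.sign(v) * q).astype(np.int8), scale

    def dequantize(q, scale, levels=4):
        return q.astype(np.float32) * scale / (levels - 1)

An adaptive variant changes `levels` between rounds to balance the per-round bit budget against quantization error over the course of training.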

Socialized learning: A survey of the paradigm shift for edge intelligence in networked systems

X Wang, Y Zhao, C Qiu, Q Hu… - … Surveys & Tutorials, 2024 - ieeexplore.ieee.org
Amidst the robust impetus from artificial intelligence (AI) and big data, edge intelligence (EI)
has emerged as a nascent computing paradigm, synthesizing AI with edge computing (EC) …

Communication-efficient distributed deep learning: A comprehensive survey

Z Tang, S Shi, W Wang, B Li, X Chu - arXiv preprint arXiv:2003.06307, 2020 - arxiv.org
Distributed deep learning (DL) has become prevalent in recent years to reduce training time
by leveraging multiple computing devices (e.g., GPUs/TPUs) due to larger models and …

1-bit Adam: Communication-efficient large-scale training with Adam's convergence speed

H Tang, S Gan, AA Awan… - International …, 2021 - proceedings.mlr.press
Scalable training of large models (like BERT and GPT-3) requires careful optimization
rooted in model design, architecture, and system capabilities. From a system standpoint …
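
At the heart of this method is 1-bit (sign) compression combined with an error-feedback memory, applied after a full-precision warm-up stage during which Adam's variance statistics are computed and then frozen. The sketch below shows only the generic compression step, not the full 1-bit Adam algorithm; names are ours.

    import numpy as np

    def compress_with_error_feedback(update, memory):
        # 1-bit (sign) compression: transmit sign bits plus one scale factor.
        # The residual lost to compression is kept in `memory` and folded
        # into the next update, so compression errors do not accumulate.
        corrected = update + memory
        scale = np.abs(corrected).mean()
        compressed = scale * np.sign(corrected)
        memory = corrected - compressed
        return compressed, memory

Each worker then sends one bit per parameter plus a scalar, roughly a 32x reduction over float32 updates, while the error feedback preserves convergence.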