Distributed learning for wireless communications: Methods, applications and challenges

L Qian, P Yang, M Xiao, OA Dobre… - IEEE Journal of …, 2022 - ieeexplore.ieee.org
With its privacy-preserving and decentralized features, distributed learning plays an
irreplaceable role in the era of wireless networks with a plethora of smart terminals, an …

Collective intelligence using 5G: Concepts, applications, and challenges in sociotechnical environments

A Narayanan, MS Korium, DC Melgarejo… - IEEE …, 2022 - ieeexplore.ieee.org
Distributed intelligence is a well-known approach for optimizing interactions among
numerous smart devices that interconnect and operate together as Internet of Things (IoT) …

Exploiting heterogeneity in robust federated best-arm identification

A Mitra, H Hassani, G Pappas - arXiv preprint arXiv:2109.05700, 2021 - arxiv.org
We study a federated variant of the best-arm identification problem in stochastic multi-armed
bandits: a set of clients, each of whom can sample only a subset of the arms, collaborate via …
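To make the setting concrete, the sketch below (not the paper's robust algorithm) has each client pull only its own subset of arms and report sufficient statistics to a server, which pools them to guess the best arm; the arm means, client-to-arm assignment, and pull budget are invented for illustration.

```python
# Illustrative sketch only: clients each sample a local subset of arms and
# report empirical sums/counts to a server, which pools them to pick the arm
# with the highest combined mean. Arm means, the client-to-arm assignment,
# and the pull budget are hypothetical.
import numpy as np

rng = np.random.default_rng(0)
true_means = np.array([0.2, 0.5, 0.7, 0.4, 0.9])      # hypothetical arm means
client_arms = [[0, 1, 2], [2, 3, 4], [0, 3, 4]]       # arms each client can pull

def client_estimates(arms, pulls_per_arm=200):
    """Pull each locally available arm and return (sums, counts)."""
    sums = np.zeros(len(true_means))
    counts = np.zeros(len(true_means))
    for a in arms:
        rewards = rng.normal(true_means[a], 1.0, size=pulls_per_arm)
        sums[a] += rewards.sum()
        counts[a] += pulls_per_arm
    return sums, counts

# Server: aggregate sufficient statistics across clients, then pick the arm
# with the highest pooled empirical mean.
total_sums = np.zeros(len(true_means))
total_counts = np.zeros(len(true_means))
for arms in client_arms:
    s, c = client_estimates(arms)
    total_sums += s
    total_counts += c

pooled_means = total_sums / np.maximum(total_counts, 1)
print("pooled estimates:", np.round(pooled_means, 3))
print("identified best arm:", int(np.argmax(pooled_means)))
```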

Collaborative linear bandits with adversarial agents: Near-optimal regret bounds

A Mitra, A Adibi, GJ Pappas… - Advances in Neural …, 2022 - proceedings.neurips.cc
We consider a linear stochastic bandit problem involving $M$ agents that can collaborate
via a central server to minimize regret. A fraction $\alpha$ of these agents are adversarial …
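The toy sketch below illustrates only why the server needs a robust combining rule in this setting, not the paper's algorithm: honest agents report noisy estimates of an unknown parameter, a fraction $\alpha$ of agents report arbitrary vectors, and a coordinate-wise median resists the corruption where plain averaging does not. The dimension, noise level, and value of $\alpha$ are assumptions.

```python
# Minimal sketch of the aggregation issue only: honest agents send noisy
# estimates of an unknown parameter theta*, a fraction alpha of agents send
# arbitrary vectors, and the server compares plain averaging with a
# coordinate-wise median. Not the algorithm from the paper.
import numpy as np

rng = np.random.default_rng(1)
d, M, alpha = 5, 20, 0.2
theta_star = rng.normal(size=d)

n_bad = int(alpha * M)
honest = theta_star + 0.1 * rng.normal(size=(M - n_bad, d))   # noisy honest estimates
adversarial = 100.0 * rng.normal(size=(n_bad, d))             # arbitrary corrupted reports
reports = np.vstack([honest, adversarial])

naive = reports.mean(axis=0)              # corrupted by the adversarial agents
robust = np.median(reports, axis=0)       # coordinate-wise median resists them

print("error of naive mean :", np.linalg.norm(naive - theta_star))
print("error of cw-median  :", np.linalg.norm(robust - theta_star))
```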

H-nobs: Achieving certified fairness and robustness in distributed learning on heterogeneous datasets

G Zhou, P Xu, Y Wang, Z Tian - Advances in Neural …, 2023 - proceedings.neurips.cc
Fairness and robustness are two important goals in the design of modern distributed
learning systems. Despite a few prior works attempting to achieve both fairness and …

Byzantine-robust and communication-efficient distributed non-convex learning over non-IID data

X He, H Zhu, Q Ling - ICASSP 2022-2022 IEEE International …, 2022 - ieeexplore.ieee.org
Motivated by the emerging federated learning applications, we jointly consider the problems
of Byzantine-robustness and communication efficiency in distributed non-convex learning …
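As a rough illustration of how robustness and communication efficiency can be combined (not the method proposed in this paper), the sketch below has each worker send a 1-bit sign-compressed gradient and the server take a coordinate-wise majority vote; the model, data, and number of Byzantine workers are fabricated.

```python
# Illustrative sketch, not the method of this paper: it combines a 1-bit
# compressor (sign of the gradient) with a robust coordinate-wise majority
# vote at the server. Model, data, and the number of Byzantine workers are
# made up for the example.
import numpy as np

rng = np.random.default_rng(2)
d, n_workers, n_byz, lr = 10, 12, 2, 0.05
w = np.zeros(d)
target = rng.normal(size=d)               # minimize ||w - target||^2, split across workers

for step in range(200):
    votes = []
    for i in range(n_workers):
        grad = 2 * (w - target) + 0.5 * rng.normal(size=d)   # noisy local gradient
        if i < n_byz:
            grad = -grad                                      # Byzantine worker flips its gradient
        votes.append(np.sign(grad))                           # 1 bit per coordinate sent to server
    vote = np.sign(np.sum(votes, axis=0))                     # coordinate-wise majority vote
    w -= lr * vote

print("distance to optimum after training:", np.linalg.norm(w - target))
```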

C-RSA: Byzantine-robust and communication-efficient distributed learning in the non-convex and non-IID regime

X He, H Zhu, Q Ling - Signal Processing, 2023 - Elsevier
The emerging federated learning applications raise challenges of Byzantine-robustness and
communication efficiency in distributed non-convex learning over non-IID data. To address …
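The name suggests C-RSA builds on RSA-style aggregation, in which an l1 penalty couples each worker to the server and bounds any single worker's influence. The sketch below shows only that RSA-style update, not the compressed algorithm from the paper; the local losses, penalty weight, and step sizes are invented.

```python
# Hedged sketch of an RSA-style update (l1 coupling between workers and
# server), not the exact C-RSA algorithm. Each honest worker minimizes its
# own quadratic loss, Byzantine workers report arbitrary models, and each
# worker's pull on the server is bounded by lam per step.
import numpy as np

rng = np.random.default_rng(3)
n_workers, n_byz, d = 10, 2, 3
targets = rng.normal(size=(n_workers, d))       # hypothetical non-IID local optima
lam, steps = 0.5, 2000

x_server = np.zeros(d)
x_workers = np.zeros((n_workers, d))

for t in range(1, steps + 1):
    eta = 1.0 / (10 + t)                        # diminishing step size
    for i in range(n_workers):
        if i < n_byz:
            x_workers[i] = 1000.0 * rng.normal(size=d)   # Byzantine: arbitrary model
        else:
            grad = 2 * (x_workers[i] - targets[i])       # local quadratic-loss gradient
            penalty = lam * np.sign(x_workers[i] - x_server)
            x_workers[i] -= eta * (grad + penalty)
    # Server moves toward a sign-balanced (median-like) point of the worker models.
    x_server -= eta * lam * np.sign(x_server - x_workers).sum(axis=0)

honest_mean = targets[n_byz:].mean(axis=0)
print("server model      :", np.round(x_server, 3))
print("honest local mean :", np.round(honest_mean, 3))
```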

LocalNewton: Reducing communication rounds for distributed learning

V Gupta, A Ghosh, M Dereziński… - Uncertainty in …, 2021 - proceedings.mlr.press
To address the communication bottleneck problem in distributed optimization within a
master-worker framework, we propose LocalNewton, a distributed second-order algorithm …
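The sketch below captures only the general idea of local second-order steps between synchronizations in a master-worker setup, not the authors' exact algorithm or step-size rules; the per-worker least-squares problems and the number of local steps are arbitrary choices for illustration.

```python
# Sketch of the local-Newton idea: each worker runs a few Newton steps on its
# own data, then the master averages the returned models once per round.
# Not the authors' exact algorithm; data sizes and local_steps are arbitrary.
# (For pure least squares one Newton step is already exact; several are kept
# only to mirror the local-iterations structure.)
import numpy as np

rng = np.random.default_rng(4)
d, n_workers, local_steps, rounds = 5, 4, 3, 5
w_true = rng.normal(size=d)

# Hypothetical per-worker least-squares data (non-identical across workers).
data = []
for _ in range(n_workers):
    A = rng.normal(size=(50, d))
    b = A @ w_true + 0.1 * rng.normal(size=50)
    data.append((A, b))

w = np.zeros(d)
for r in range(rounds):
    local_models = []
    for A, b in data:
        w_local = w.copy()
        H = A.T @ A                                   # local Hessian
        for _ in range(local_steps):
            g = A.T @ (A @ w_local - b)               # local gradient
            w_local -= np.linalg.solve(H, g)          # local Newton step
        local_models.append(w_local)
    w = np.mean(local_models, axis=0)                 # master averages once per round
    print(f"round {r}: distance to w_true = {np.linalg.norm(w - w_true):.4f}")
```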

Federated learning in the presence of adversarial client unavailability

L Su, M Xiang, J Xu, P Yang - arXiv preprint arXiv:2305.19971, 2023 - arxiv.org
Federated learning is a decentralized machine learning framework that enables
collaborative model training without revealing raw data. Due to the diverse hardware and …
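The toy sketch below illustrates the failure mode this line of work targets rather than any proposed fix: plain FedAvg over whichever clients respond each round drifts toward the available clients' optima when an adversary keeps one group offline. All data and the availability pattern are fabricated.

```python
# Toy sketch of the client-unavailability issue, not the estimator from the
# paper: FedAvg over responders only. If one non-IID group never responds,
# the global model drifts toward the remaining clients' optima.
import numpy as np

rng = np.random.default_rng(5)
d, n_clients, rounds, local_lr = 3, 10, 100, 0.1
# Two groups of clients with different local optima (non-IID).
optima = np.vstack([np.ones((5, d)), -np.ones((5, d))])

def run(available_mask):
    w = np.zeros(d)
    for _ in range(rounds):
        updates = []
        for i in range(n_clients):
            if not available_mask[i]:
                continue                              # adversarially kept offline
            w_i = w - local_lr * 2 * (w - optima[i])  # one local gradient step
            updates.append(w_i)
        w = np.mean(updates, axis=0)                  # FedAvg over responders only
    return w

all_available = np.ones(n_clients, dtype=bool)
group_dropped = np.array([True] * 5 + [False] * 5)    # second group never responds
print("all clients available :", np.round(run(all_available), 3))
print("one group unavailable :", np.round(run(group_dropped), 3))
```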