Clustered federated learning: Model-agnostic distributed multitask optimization under privacy constraints

F Sattler, KR Müller, W Samek - IEEE Transactions on Neural …, 2020 - ieeexplore.ieee.org
Federated learning (FL) is currently the most widely adopted framework for collaborative
training of (deep) machine learning models under privacy constraints. Despite its popularity, it …

Communication-Efficient Stochastic Gradient Descent Ascent with Momentum Algorithms

Y Zhang, M Qiu, H Gao - IJCAI, 2023 - ijcai.org
Numerous machine learning models can be formulated as a stochastic minimax optimization
problem, such as imbalanced data classification with AUC maximization. Developing …

Federated optimization in heterogeneous networks

T Li, AK Sahu, M Zaheer, M Sanjabi… - … of Machine Learning …, 2020 - proceedings.mlsys.org
Federated Learning is a distributed learning paradigm with two key challenges that
differentiate it from traditional distributed optimization: (1) significant variability in terms of the …

Communication-efficient distributed learning: An overview

X Cao, T Başar, S Diggavi, YC Eldar… - IEEE Journal on …, 2023 - ieeexplore.ieee.org
Distributed learning is envisioned as the bedrock of next-generation intelligent networks,
where intelligent agents, such as mobile devices, robots, and sensors, exchange information …

Model pruning enables efficient federated learning on edge devices

Y Jiang, S Wang, V Valls, BJ Ko… - … on Neural Networks …, 2022 - ieeexplore.ieee.org
Federated learning (FL) allows model training from local data collected by edge/mobile
devices while preserving data privacy, which has wide applicability to image and vision …

Tighter theory for local SGD on identical and heterogeneous data

A Khaled, K Mishchenko… - … Conference on Artificial …, 2020 - proceedings.mlr.press
We provide a new analysis of local SGD, removing unnecessary assumptions and
elaborating on the difference between two data regimes: identical and heterogeneous. In …