Advances and open problems in federated learning

P Kairouz, HB McMahan, B Avent… - … and trends® in …, 2021 - nowpublishers.com
Federated learning (FL) is a machine learning setting where many clients (e.g., mobile
devices or whole organizations) collaboratively train a model under the orchestration of a …

Making AI forget you: Data deletion in machine learning

A Ginart, M Guan, G Valiant… - Advances in neural …, 2019 - proceedings.neurips.cc
Intense recent discussions have focused on how to provide individuals with control over
when their data can and cannot be used---the EU's Right To Be Forgotten regulation is an …

Communication-efficient distributed statistical inference

MI Jordan, JD Lee, Y Yang - Journal of the American Statistical …, 2019 - Taylor & Francis
We present a communication-efficient surrogate likelihood (CSL) framework for solving
distributed statistical inference problems. CSL provides a communication-efficient surrogate …

Topics and techniques in distribution testing: A biased but representative sample

CL Canonne - Foundations and Trends® in Communications …, 2022 - nowpublishers.com
We focus on some specific problems in distribution testing, taking goodness-of-fit as a
running example. In particular, we do not aim to provide a comprehensive summary of all the …

Distributed mean estimation with limited communication

AT Suresh, FX Yu, S Kumar… - … on machine learning, 2017 - proceedings.mlr.press
Motivated by the need for distributed learning and optimization algorithms with low
communication cost, we study communication efficient algorithms for distributed mean …

Learning with user-level privacy

D Levy, Z Sun, K Amin, S Kale… - Advances in …, 2021 - proceedings.neurips.cc
We propose and analyze algorithms to solve a range of learning tasks under user-level
differential privacy constraints. Rather than guaranteeing only the privacy of individual …

Communication-efficient sparse regression

JD Lee, Q Liu, Y Sun, JE Taylor - Journal of Machine Learning Research, 2017 - jmlr.org
We devise a communication-efficient approach to distributed sparse regression in the high-
dimensional setting. The key idea is to average "debiased" or "desparsified" lasso …

Privacy amplification via compression: Achieving the optimal privacy-accuracy-communication trade-off in distributed mean estimation

WN Chen, D Song, A Ozgur… - Advances in Neural …, 2023 - proceedings.neurips.cc
Privacy and communication constraints are two major bottlenecks in federated learning (FL)
and analytics (FA). We study the optimal accuracy of mean and frequency estimation …

Breaking the communication-privacy-accuracy trilemma

WN Chen, P Kairouz, A Ozgur - Advances in Neural …, 2020 - proceedings.neurips.cc
Two major challenges in distributed learning and estimation are 1) preserving the privacy of
the local samples; and 2) communicating them efficiently to a central server, while achieving …

Communication-efficient federated learning via optimal client sampling

M Ribero, H Vikalo - arXiv preprint arXiv:2007.15197, 2020 - arxiv.org
Federated learning (FL) ameliorates privacy concerns in settings where a central server
coordinates learning from data distributed across many clients. The clients train locally and …