User-level differentially private learning via correlated sampling
Most works in learning with differential privacy (DP) have focused on the setting where each
user has a single sample. In this work, we consider the setting where each user holds $ m …
Instance-optimal mean estimation under differential privacy
Mean estimation under differential privacy is a fundamental problem, but worst-case optimal
mechanisms do not offer meaningful utility guarantees in practice when the global sensitivity …
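To illustrate the weakness the snippet points at, here is a minimal sketch (not the paper's instance-optimal mechanism) of worst-case mean estimation via the Laplace mechanism; the function name and interface are illustrative, and noise is calibrated to the global sensitivity of the clipped mean.

```python
import random

def dp_mean_laplace(values, lower, upper, epsilon):
    """epsilon-DP mean via the Laplace mechanism, calibrated to the
    *global* sensitivity (upper - lower) / n. When the a-priori range
    [lower, upper] is loose, this worst-case noise can swamp the
    signal, which is the gap instance-optimal mechanisms address."""
    n = len(values)
    clipped = [min(max(v, lower), upper) for v in values]
    true_mean = sum(clipped) / n
    scale = (upper - lower) / (n * epsilon)  # Laplace scale = sensitivity / epsilon
    # A Laplace(0, scale) sample is the difference of two Exp(1/scale) samples.
    noise = random.expovariate(1.0 / scale) - random.expovariate(1.0 / scale)
    return true_mean + noise
```

Note that the noise scale grows linearly with the range width `upper - lower`, independently of how concentrated the actual data is.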
Samplable anonymous aggregation for private federated data analysis
We revisit the problem of designing scalable protocols for private statistics and private
federated learning when each device holds its private data. Locally differentially private …
Locally private k-means in one round
We provide an approximation algorithm for k-means clustering in the one-round (a.k.a. non-interactive) local model of differential privacy (DP). Our algorithm …
Private non-convex federated learning without a trusted server
We study federated learning (FL) with non-convex loss functions and data from people who
do not trust the server or other silos. In this setting, each silo (e.g., hospital) must protect the …
PPML-Omics: a privacy-preserving federated machine learning method protects patients' privacy in omic data
Modern machine learning models for various tasks in omic data analysis give rise to
threats of privacy leakage for the patients involved in those datasets. Here, we propose a …
Optimal unbiased randomizers for regression with label differential privacy
A Badanidiyuru Varadaraja, B Ghazi… - Advances in …, 2023 - proceedings.neurips.cc
We propose a new family of label randomizers for training regression models under the
constraint of label differential privacy (DP). In particular, we leverage the trade-offs between …
Network shuffling: Privacy amplification via random walks
Recently, it has been shown that shuffling can amplify the central differential privacy guarantees of
data randomized with local differential privacy. Within this setup, a centralized, trusted …
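The shuffle-model idea the snippet describes can be sketched in a few lines: each user applies a local randomizer, a shuffler discards message order before the analyzer sees anything, and the analyzer debiases the aggregate. This is a generic illustration with binary randomized response, not the network-shuffling protocol of the paper; all names are illustrative.

```python
import math
import random

def randomized_response(bit, epsilon):
    """epsilon-locally-DP binary randomized response: report the true
    bit with probability e^eps / (e^eps + 1), otherwise flip it."""
    p_truth = math.exp(epsilon) / (math.exp(epsilon) + 1.0)
    return bit if random.random() < p_truth else 1 - bit

def shuffled_count(bits, epsilon):
    """Shuffle-model count estimate: users randomize locally, the
    shuffler permutes the reports (breaking the user-message link that
    amplification analyses exploit), and the analyzer debiases the sum."""
    reports = [randomized_response(b, epsilon) for b in bits]
    random.shuffle(reports)  # the shuffler: trusted only to permute
    p = math.exp(epsilon) / (math.exp(epsilon) + 1.0)
    n = len(bits)
    return (sum(reports) - n * (1.0 - p)) / (2.0 * p - 1.0)
```

Privacy amplification results say the analyzer's view of the shuffled reports satisfies a much stronger central DP guarantee than the per-user local epsilon alone would suggest.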
Differentially private federated learning with an adaptive noise mechanism
Federated Learning (FL) enables multiple distributed clients to collaboratively train a model
on their own datasets. To mitigate the potential privacy threats in FL, researchers propose the DP …
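A minimal sketch of one DP federated-averaging round, to make the mechanism the snippet alludes to concrete: clip each client's update, add Gaussian noise calibrated to the clip norm, and average. The decaying noise schedule shown is a hypothetical illustration of "adaptive noise", not the paper's specific mechanism; all names and parameters are assumptions.

```python
import math
import random

def dp_federated_round(client_updates, clip_norm, sigma):
    """One round of DP federated averaging: clip each client's update
    to clip_norm in L2, sum the clipped updates, add Gaussian noise
    scaled by sigma * clip_norm per coordinate, then average."""
    d = len(client_updates[0])
    total = [0.0] * d
    for u in client_updates:
        norm = math.sqrt(sum(x * x for x in u))
        scale = min(1.0, clip_norm / norm) if norm > 0 else 1.0
        for i in range(d):
            total[i] += u[i] * scale
    n = len(client_updates)
    return [(total[i] + random.gauss(0.0, sigma * clip_norm)) / n for i in range(d)]

def adaptive_sigma(round_idx, sigma0=1.0, decay=0.05):
    """Hypothetical adaptive schedule: decay the noise multiplier as
    training progresses (an adaptive mechanism would tune this from
    observed training signals, not a fixed formula)."""
    return sigma0 / (1.0 + decay * round_idx)
```

The privacy accounting (composing the per-round Gaussian mechanism across rounds) is deliberately omitted here; in practice it determines how small sigma is allowed to get.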
Shuffle differential private data aggregation for random population
Bridging the advantages of differential privacy in both the centralized model (i.e., high accuracy)
and the local model (i.e., minimal trust), the shuffle privacy model has potential applications in …