Learning mixtures of gaussians using the DDPM objective

K Shah, S Chen, A Klivans - Advances in Neural …, 2023 - proceedings.neurips.cc
Recent works have shown that diffusion models can learn essentially any distribution
provided one can perform score estimation. Yet it remains poorly understood under what …
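The DDPM objective mentioned in this snippet can be illustrated with a minimal sketch. Everything here is an assumption for illustration: the function name `ddpm_loss`, the linear `alpha_bar` schedule, and the toy zero-predicting model are not from the paper; the block only shows the standard noise-prediction loss whose minimization amounts to score estimation.

```python
import numpy as np

def ddpm_loss(predict_noise, x0, alpha_bar, rng=None):
    """One-sample Monte Carlo estimate of the DDPM objective.

    The loss trains predict_noise(x_t, t) to recover the Gaussian noise eps
    mixed into x_t = sqrt(alpha_bar_t) * x0 + sqrt(1 - alpha_bar_t) * eps;
    minimizing it is equivalent to estimating the score of the noised
    distribution. `predict_noise` and `alpha_bar` are illustrative stand-ins.
    """
    rng = np.random.default_rng(rng)
    t = rng.integers(0, len(alpha_bar))          # random diffusion time
    eps = rng.standard_normal(x0.shape)          # the noise to be predicted
    x_t = np.sqrt(alpha_bar[t]) * x0 + np.sqrt(1.0 - alpha_bar[t]) * eps
    return np.mean((predict_noise(x_t, t) - eps) ** 2)

# Toy check with a trivial "model" that always predicts zero noise.
alpha_bar = np.linspace(0.99, 0.01, 100)
x0 = np.zeros(8)
loss = ddpm_loss(lambda x_t, t: np.zeros_like(x_t), x0, alpha_bar, rng=0)
```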

Private estimation algorithms for stochastic block models and mixture models

H Chen, V Cohen-Addad, T d'Orsi… - Advances in …, 2023 - proceedings.neurips.cc
We introduce general tools for designing efficient private estimation algorithms in high-
dimensional settings, whose statistical guarantees almost match those of the best known …

Polynomial time and private learning of unbounded gaussian mixture models

J Arbas, H Ashtiani, C Liaw - International Conference on …, 2023 - proceedings.mlr.press
We study the problem of privately estimating the parameters of $d$-dimensional Gaussian
Mixture Models (GMMs) with $k$ components. For this, we develop a technique to reduce …

SQ lower bounds for learning mixtures of separated and bounded covariance gaussians

I Diakonikolas, DM Kane, T Pittas… - The Thirty Sixth …, 2023 - proceedings.mlr.press
We study the complexity of learning mixtures of separated Gaussians with common
unknown bounded covariance matrix. Specifically, we focus on learning Gaussian mixture …

Learning mixtures of gaussians using diffusion models

K Gatmiry, J Kelner, H Lee - arXiv preprint arXiv:2404.18869, 2024 - arxiv.org
We give a new algorithm for learning mixtures of $k$ Gaussians (with identity covariance in
$\mathbb{R}^n$) to TV error $\varepsilon$, with quasi-polynomial ($O(n^{\text{poly …

Learning general gaussian mixtures with efficient score matching

S Chen, V Kontonis, K Shah - arXiv preprint arXiv:2404.18893, 2024 - arxiv.org
We study the problem of learning mixtures of $k$ Gaussians in $d$ dimensions. We make
no separation assumptions on the underlying mixture components: we only require that the …

A fourier approach to mixture learning

M Qiao, G Guruganesh, A Rawat… - Advances in …, 2022 - proceedings.neurips.cc
We revisit the problem of learning mixtures of spherical Gaussians. Given samples from a
mixture $\frac{1}{k}\sum_{j=1}^{k}\mathcal{N}(\mu_j, I_d)$, the goal is to estimate the …
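The uniform spherical-mixture model in this snippet can be sketched directly. The function name, the chosen means, and the sample counts below are illustrative assumptions, not anything from the paper; the block only reproduces the generative model $\frac{1}{k}\sum_{j=1}^{k}\mathcal{N}(\mu_j, I_d)$.

```python
import numpy as np

def sample_spherical_mixture(means, n_samples, rng=None):
    """Draw samples from the uniform mixture (1/k) * sum_j N(mu_j, I_d).

    `means` is a (k, d) array of component means; weights are uniform,
    matching the model in the snippet above.
    """
    rng = np.random.default_rng(rng)
    k, d = means.shape
    # Pick a component uniformly for each sample, then add standard noise.
    labels = rng.integers(0, k, size=n_samples)
    return means[labels] + rng.standard_normal((n_samples, d)), labels

# Example: k = 3 well-separated means in d = 5 dimensions.
means = np.stack([10.0 * np.eye(5)[j] for j in range(3)])
X, labels = sample_spherical_mixture(means, n_samples=1000, rng=0)
```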

Efficient certificates of anti-concentration beyond gaussians

A Bakshi, PK Kothari, G Rajendran… - 2024 IEEE 65th …, 2024 - ieeexplore.ieee.org
A set of high dimensional points $X = \{x_1, x_2, \ldots, x_n\} \subseteq \mathbb{R}^d$ in isotropic position is said to be
$\delta$-anti-concentrated if for every direction $v$, the fraction of points in $X$ satisfying …
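The snippet's definition is truncated, so the following sketch assumes one common formalization: the fraction of points with $|\langle x_i, v\rangle| \le \delta$ should be small for every direction $v$. The function name and the random-direction probing are illustrative; sampling directions gives only a heuristic check, not the efficient certificate the paper is about.

```python
import numpy as np

def anticoncentration_fraction(X, delta, n_directions=200, rng=None):
    """Estimate, over random unit directions v, the largest fraction of
    points x in X with |<x, v>| <= delta.

    A set is (heuristically) delta-anti-concentrated when this fraction
    stays O(delta); probing random v cannot certify "for every direction".
    """
    rng = np.random.default_rng(rng)
    n, d = X.shape
    V = rng.standard_normal((n_directions, d))
    V /= np.linalg.norm(V, axis=1, keepdims=True)   # unit directions
    projections = X @ V.T                            # shape (n, n_directions)
    fractions = (np.abs(projections) <= delta).mean(axis=0)
    return fractions.max()

# Standard Gaussian samples are anti-concentrated: for small delta, the
# fraction within delta of any hyperplane is roughly 2*delta/sqrt(2*pi).
rng = np.random.default_rng(1)
X = rng.standard_normal((5000, 10))
worst = anticoncentration_fraction(X, delta=0.1, rng=2)
```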

Mixtures of gaussians are privately learnable with a polynomial number of samples

M Afzali, H Ashtiani, C Liaw - arXiv preprint arXiv:2309.03847, 2023 - arxiv.org
We study the problem of estimating mixtures of Gaussians under the constraint of differential
privacy (DP). Our main result is that $\tilde{O}(k^2 d^4 \log(1/\delta)/\alpha^2\varepsilon) …

Mixtures of Gaussians are Privately Learnable with a Polynomial Number of Samples

M Afzali, H Ashtiani, C Liaw - International Conference on …, 2024 - proceedings.mlr.press
We study the problem of estimating mixtures of Gaussians under the constraint of differential
privacy (DP). Our main result is that $\text{poly}(k, d, 1/\alpha, 1/\varepsilon, \log(1/\delta)) …