Learning mixtures of gaussians using the DDPM objective
Recent works have shown that diffusion models can learn essentially any distribution
provided one can perform score estimation. Yet it remains poorly understood under what …
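For intuition about the object such diffusion methods estimate: for a mixture of identity-covariance Gaussians, the score $\nabla_x \log p(x)$ has a closed form as a posterior-weighted average of $(\mu_j - x)$. A minimal illustrative sketch (not the paper's algorithm; all names are ours):

```python
import numpy as np

def mixture_score(x, means, weights):
    """Score grad_x log p(x) for p(x) = sum_j w_j * N(x; mu_j, I).

    The score equals sum_j post_j(x) * (mu_j - x), where post_j(x) is the
    posterior probability of component j given x.
    """
    # log of w_j * N(x; mu_j, I), up to a constant shared by all components
    logits = np.log(weights) - 0.5 * np.sum((x - means) ** 2, axis=1)
    post = np.exp(logits - logits.max())
    post /= post.sum()                 # posterior over components
    return post @ (means - x)          # sum_j post_j * (mu_j - x)

# Sanity check against a numerical gradient of log p.
means = np.array([[2.0, 0.0], [-2.0, 0.0]])
weights = np.array([0.5, 0.5])
x = np.array([0.5, -0.3])

def log_p(x):
    logits = np.log(weights) - 0.5 * np.sum((x - means) ** 2, axis=1)
    m = logits.max()
    return m + np.log(np.exp(logits - m).sum())

eps = 1e-5
num_grad = np.array([(log_p(x + eps * e) - log_p(x - eps * e)) / (2 * eps)
                     for e in np.eye(2)])
assert np.allclose(mixture_score(x, means, weights), num_grad, atol=1e-6)
```

Learning this score function from samples (e.g. via the DDPM objective) is the hard part; the closed form above is only available when the means are known.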
Private estimation algorithms for stochastic block models and mixture models
We introduce general tools for designing efficient private estimation algorithms in high-dimensional settings, whose statistical guarantees almost match those of the best known …
Polynomial time and private learning of unbounded gaussian mixture models
J Arbas, H Ashtiani, C Liaw - International Conference on …, 2023 - proceedings.mlr.press
We study the problem of privately estimating the parameters of $d$-dimensional Gaussian
Mixture Models (GMMs) with $k$ components. For this, we develop a technique to reduce …
SQ lower bounds for learning mixtures of separated and bounded covariance gaussians
We study the complexity of learning mixtures of separated Gaussians with common
unknown bounded covariance matrix. Specifically, we focus on learning Gaussian mixture …
Learning mixtures of gaussians using diffusion models
We give a new algorithm for learning mixtures of $k$ Gaussians (with identity covariance in
$\mathbb{R}^n$) to TV error $\varepsilon$, with quasi-polynomial ($O(n^{\text{poly …
Learning general gaussian mixtures with efficient score matching
We study the problem of learning mixtures of $k$ Gaussians in $d$ dimensions. We make
no separation assumptions on the underlying mixture components: we only require that the …
A fourier approach to mixture learning
We revisit the problem of learning mixtures of spherical Gaussians. Given samples from a
mixture $\frac{1}{k}\sum_{j=1}^{k}\mathcal{N}(\mu_j, I_d)$, the goal is to estimate the …
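The Fourier approach works with the empirical characteristic function of the mixture, which concentrates around its population value. A hypothetical one-dimensional sketch (component count, means, and sample size are illustrative choices, not from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)

# Samples from a uniform mixture (1/k) sum_j N(mu_j, 1) with k = 2.
mus = np.array([-3.0, 3.0])
comps = rng.integers(0, 2, size=20000)
samples = rng.normal(mus[comps], 1.0)

def empirical_cf(samples, t):
    """Empirical characteristic function (1/n) sum_i exp(i t x_i)."""
    return np.exp(1j * t * samples).mean()

def mixture_cf(mus, t):
    """True CF of the uniform mixture: e^{-t^2/2} * (1/k) sum_j e^{i t mu_j}."""
    return np.exp(-t**2 / 2) * np.exp(1j * t * mus).mean()

# With n = 20000 samples the empirical CF is within O(1/sqrt(n)) of the truth.
t = 0.5
assert abs(empirical_cf(samples, t) - mixture_cf(mus, t)) < 0.05
```

Recovering the means from CF evaluations at chosen frequencies is where the actual algorithmic work lies; this sketch only shows the estimator the method has access to.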
Efficient certificates of anti-concentration beyond gaussians
A set of high-dimensional points $X = \{x_1, x_2, \ldots, x_n\} \subseteq \mathbb{R}^d$ in isotropic position is said to be
$\delta$-anti-concentrated if for every direction $v$, the fraction of points in $X$ satisfying …
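The condition can be checked empirically for a fixed direction: count the fraction of projections landing in a small band around zero. A minimal sketch, assuming the standard formulation in which the band is $[-\delta, \delta]$ for a unit direction (certifying the bound over *every* direction is the hard part the paper addresses):

```python
import numpy as np

rng = np.random.default_rng(1)
n, d, delta = 5000, 5, 0.1

# Isotropic Gaussian points (in isotropic position up to sampling noise).
X = rng.normal(size=(n, d))

def anticoncentration_fraction(X, v, delta):
    """Fraction of points whose projection onto unit direction v lies in [-delta, delta]."""
    v = v / np.linalg.norm(v)
    return float(np.mean(np.abs(X @ v) <= delta))

# For Gaussian data the fraction along any fixed direction should be close to
# P(|N(0,1)| <= delta), about 0.08 for delta = 0.1.
fracs = [anticoncentration_fraction(X, rng.normal(size=d), delta)
         for _ in range(20)]
assert max(fracs) < 0.12
```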
Mixtures of gaussians are privately learnable with a polynomial number of samples
We study the problem of estimating mixtures of Gaussians under the constraint of differential
privacy (DP). Our main result is that $\tilde{O}(k^2 d^4 \log(1/\delta)/\alpha^2\varepsilon) …
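To make the DP constraint concrete: the basic primitive behind many such results is releasing a statistic plus calibrated Gaussian noise. A sketch of an $(\varepsilon, \delta)$-DP mean estimate via clipping and the Gaussian mechanism (illustrative only; the paper's GMM algorithm is far more involved):

```python
import numpy as np

rng = np.random.default_rng(2)

def dp_mean(x, clip, eps, delta, rng):
    """(eps, delta)-DP estimate of the mean of 1-D data.

    Clip each point to [-clip, clip] so one point changes the mean by at
    most 2*clip/n, then add Gaussian noise scaled to that sensitivity.
    """
    n = len(x)
    clipped = np.clip(x, -clip, clip)
    sensitivity = 2 * clip / n
    # Standard Gaussian-mechanism calibration.
    sigma = sensitivity * np.sqrt(2 * np.log(1.25 / delta)) / eps
    return clipped.mean() + rng.normal(0.0, sigma)

x = rng.normal(1.0, 1.0, size=100_000)
est = dp_mean(x, clip=5.0, eps=1.0, delta=1e-5, rng=rng)
assert abs(est - 1.0) < 0.1  # accurate despite the privacy noise
```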