Lattice-based methods surpass sum-of-squares in clustering
Clustering is a fundamental primitive in unsupervised learning which gives rise to a rich
class of computationally-challenging inference tasks. In this work, we focus on the canonical …
SQ lower bounds for learning mixtures of separated and bounded covariance Gaussians
We study the complexity of learning mixtures of separated Gaussians with common
unknown bounded covariance matrix. Specifically, we focus on learning Gaussian mixture …
Computational and statistical thresholds in multi-layer stochastic block models
We study the problem of community recovery and detection in multi-layer stochastic block
models, focusing on the critical network density threshold for consistent community structure …
Pseudo-labeling for kernel ridge regression under covariate shift
K. Wang, arXiv preprint arXiv:2302.10160, 2023.
We develop and analyze a principled approach to kernel ridge regression under covariate
shift. The goal is to learn a regression function with small mean squared error over a target …
Tensor-on-tensor regression: Riemannian optimization, over-parameterization, statistical-computational gap and their interplay
The Annals of Statistics, 2024, Vol. 52, No. 6, 2583–2612.
Leave-one-out singular subspace perturbation analysis for spectral clustering
In the supplement [46], we first provide the proof of Theorem 2.3 in Appendix A, followed by
the proofs of results of Section 3.4 in Appendix B. The proof of Theorem 3.3 is given in …
Optimal spectral recovery of a planted vector in a subspace
Recovering a planted vector $v$ in an $n$-dimensional random subspace of $\mathbb{R}^N$ is a generic task related to many problems in machine learning and statistics, such …
Sum-of-squares lower bounds for non-Gaussian component analysis
Non-Gaussian Component Analysis (NGCA) is the statistical task of finding a non-Gaussian
direction in a high-dimensional dataset. Specifically, given iid samples from a distribution …
Computational lower bounds for graphon estimation via low-degree polynomials
Y. Luo, C. Gao, The Annals of Statistics, 2024, Vol. 52, No. 5, 2318–2348. https://doi.org/10.1214/24-AOS2437
Statistical-computational trade-offs in tensor PCA and related problems via communication complexity
The Annals of Statistics, 2024, Vol. 52, No. 1, 131–156. https://doi.org/10.1214/23-AOS2331