High-dimensional data bootstrap
This article reviews recent progress in high-dimensional bootstrap. We first review high-dimensional central limit theorems for distributions of sample mean vectors over the …
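Aside (ours, not the article's): the statistic at the heart of this literature is the maximum, over coordinates, of the normalized sample mean, whose distribution is approximated by a Gaussian-multiplier bootstrap. A minimal sketch of that procedure, with illustrative names (`multiplier_bootstrap`, `n_boot`) of our own choosing:

```python
import numpy as np

def multiplier_bootstrap(X, n_boot=1000, rng=None):
    """Gaussian-multiplier bootstrap for the max-coordinate statistic
    T = max_j sqrt(n) * (mean_j(X) - mu_j), approximated by
    T* = max_j (1/sqrt(n)) * sum_i e_i * (X_ij - mean_j(X)),
    where the e_i are iid standard normal multiplier weights."""
    rng = np.random.default_rng(rng)
    n, p = X.shape
    Xc = X - X.mean(axis=0)            # center each coordinate
    stats = np.empty(n_boot)
    for b in range(n_boot):
        e = rng.standard_normal(n)     # multiplier weights
        stats[b] = np.max(e @ Xc) / np.sqrt(n)
    return stats

# Example: 95% simultaneous critical value for a 500-dimensional mean
rng = np.random.default_rng(0)
X = rng.standard_normal((200, 500))
crit = np.quantile(multiplier_bootstrap(X, rng=1), 0.95)
print(crit)
```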
[BOOK][B] Analysis of Boolean functions
R O'Donnell - 2014 - books.google.com
Boolean functions are perhaps the most basic objects of study in theoretical computer science. They also arise in other areas of mathematics, including combinatorics, statistical …
Discriminative K-SVD for dictionary learning in face recognition
In a sparse-representation-based face recognition scheme, the desired dictionary should have good representational power (i.e., being able to span the subspace of all faces) while …
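Aside (ours): D-KSVD adds a discriminative term to the dictionary-learning objective; the baseline sparse-representation scheme the snippet alludes to codes a test face over a dictionary of training faces and classifies by the smallest per-class reconstruction residual. A rough sketch under that reading (not the paper's algorithm), with names of our own choosing:

```python
import numpy as np
from sklearn.linear_model import orthogonal_mp

def src_classify(D, labels, y, n_nonzero_coefs=10):
    """Sparse-code y over dictionary D (columns = training faces), then
    pick the class whose atoms reconstruct y with the smallest residual."""
    x = orthogonal_mp(D, y, n_nonzero_coefs=n_nonzero_coefs)
    best_cls, best_res = None, np.inf
    for cls in np.unique(labels):
        mask = labels == cls
        residual = np.linalg.norm(y - D[:, mask] @ x[mask])
        if residual < best_res:
            best_cls, best_res = cls, residual
    return best_cls

# Toy usage: 40-dimensional "faces", 3 identities, 20 training samples each
rng = np.random.default_rng(0)
labels = np.repeat(np.arange(3), 20)
D = rng.standard_normal((40, 60))
D /= np.linalg.norm(D, axis=0)                 # unit-norm atoms
y = D[:, 5] + 0.05 * rng.standard_normal(40)   # noisy copy of a class-0 atom
print(src_classify(D, labels, y))              # most likely prints 0
```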
Central limit theorems and bootstrap in high dimensions
This paper derives central limit and bootstrap theorems for probabilities that sums of centered high-dimensional random vectors hit hyperrectangles and sparsely convex sets …
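Aside (ours, notation not necessarily the paper's): the shape of a high-dimensional CLT over hyperrectangles is

```latex
% X_1, ..., X_n independent, centered random vectors in R^p (p possibly >> n),
% S_n = n^{-1/2} (X_1 + ... + X_n), Z a centered Gaussian vector with the same
% covariance as S_n, and A^re the class of all hyperrectangles in R^p.
\[
  \sup_{A \in \mathcal{A}^{\mathrm{re}}}
  \Bigl| \mathbb{P}(S_n \in A) - \mathbb{P}(Z \in A) \Bigr| \longrightarrow 0,
  \qquad
  \mathcal{A}^{\mathrm{re}} = \Bigl\{ \prod_{j=1}^{p} [a_j, b_j] : -\infty \le a_j \le b_j \le \infty \Bigr\}.
\]
```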
Comparison and anti-concentration bounds for maxima of Gaussian random vectors
Slepian and Sudakov–Fernique type inequalities, which compare expectations of maxima of Gaussian random vectors under certain restrictions on the covariance matrices …
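Aside (ours, constants suppressed; these are the generic forms, not the paper's sharpened versions): the two ingredients named in this abstract are comparison and anti-concentration bounds of the following flavor.

```latex
% Slepian-type comparison: X, Y centered Gaussian vectors in R^p with
% E[X_j^2] = E[Y_j^2] for all j and E[X_i X_j] <= E[Y_i Y_j] for all i != j
% (Y is the more correlated vector); then
\[
  \mathbb{E}\Bigl[\max_{1 \le j \le p} X_j\Bigr]
  \;\ge\;
  \mathbb{E}\Bigl[\max_{1 \le j \le p} Y_j\Bigr].
\]
% Anti-concentration for the Gaussian maximum: if E[Y_j^2] >= b^2 > 0 for all j
% and p >= 2, then for every eps > 0 and a universal constant C,
\[
  \sup_{x \in \mathbb{R}}
  \mathbb{P}\Bigl( \bigl|\max_{1 \le j \le p} Y_j - x\bigr| \le \varepsilon \Bigr)
  \;\le\; C \,\frac{\varepsilon}{b}\, \sqrt{\log p}.
\]
```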
The optimality of polynomial regression for agnostic learning under Gaussian marginals in the SQ model
We study the problem of agnostic learning under the Gaussian distribution in the Statistical Query (SQ) model. We develop a method for finding hard families of examples for a wide …
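Aside (ours): the "polynomial regression" of the title refers to the agnostic learner that fits a low-degree polynomial to the labels and thresholds it; the SQ lower bounds indicate this approach is essentially optimal. A bare-bones sketch of the learner itself (using L2 regression in place of the L1 regression of the classical analysis; names are our own):

```python
import numpy as np
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression

def poly_regression_learner(X_train, y_train, degree=2):
    """Fit a degree-d polynomial to +/-1 labels, classify by its sign."""
    feats = PolynomialFeatures(degree=degree, include_bias=True)
    Phi = feats.fit_transform(X_train)
    reg = LinearRegression().fit(Phi, y_train)
    return lambda X: np.sign(reg.predict(feats.transform(X)))

# Toy usage on Gaussian marginals
rng = np.random.default_rng(0)
X = rng.standard_normal((2000, 5))
y = np.sign(X[:, 0] * X[:, 1] + 0.1 * rng.standard_normal(2000))
h = poly_regression_learner(X, y, degree=2)
print((h(X) == y).mean())   # training accuracy of the degree-2 fit
```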
Recent progress and open problems in algorithmic convex geometry
SS Vempala - IARCS Annual Conference on Foundations of …, 2010 - drops.dagstuhl.de
This article is a survey of developments in algorithmic convex geometry over the past decade. These include algorithms for sampling, optimization, integration, rounding and …
A moment-matching approach to testable learning and a new characterization of Rademacher complexity
A remarkable recent paper by Rubinfeld and Vasilyan (2022) initiated the study of testable learning, where the goal is to replace hard-to-verify distributional assumptions (such as …
Learning deep ReLU networks is fixed-parameter tractable
We consider the problem of learning an unknown ReLU network with respect to Gaussian inputs and obtain the first nontrivial results for networks of depth more than two. We give an …
Learning mixtures of Gaussians using diffusion models
We give a new algorithm for learning mixtures of $k$ Gaussians (with identity covariance in $\mathbb{R}^n$) to TV error $\varepsilon$, with quasi-polynomial ($O(n^{\text{poly …
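Aside (ours): the snippet truncates before the rate; purely to fix notation, the model class being learned is a mixture of $k$ identity-covariance Gaussians in $\mathbb{R}^n$. A sketch of sampling from that model (not of the paper's diffusion-based algorithm), with parameter names of our own:

```python
import numpy as np

def sample_gaussian_mixture(n_samples, means, weights, rng=None):
    """Draw samples from a mixture of k identity-covariance Gaussians in R^n:
    pick a component with probability weights[j], then add standard
    Gaussian noise to its mean."""
    rng = np.random.default_rng(rng)
    means = np.asarray(means, dtype=float)        # shape (k, n)
    comps = rng.choice(len(weights), size=n_samples, p=weights)
    return means[comps] + rng.standard_normal((n_samples, means.shape[1]))

# Example: 3 well-separated components in R^10
means = np.vstack([np.zeros(10), 5 * np.eye(10)[0], 5 * np.eye(10)[1]])
X = sample_gaussian_mixture(10_000, means, weights=[0.5, 0.3, 0.2], rng=0)
```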