Score approximation, estimation and distribution recovery of diffusion models on low-dimensional data

M Chen, K Huang, T Zhao… - … Conference on Machine …, 2023 - proceedings.mlr.press
Diffusion models achieve state-of-the-art performance in various generation tasks. However,
their theoretical foundations fall far behind. This paper studies score approximation …

A survey on statistical theory of deep learning: Approximation, training dynamics, and generative models

N Suh, G Cheng - Annual Review of Statistics and Its Application, 2024 - annualreviews.org
In this article, we review the literature on statistical theories of neural networks from three
perspectives: approximation, training dynamics, and generative models. In the first part …

Learning smooth functions in high dimensions: from sparse polynomials to deep neural networks

B Adcock, S Brugiapaglia, N Dexter… - arXiv preprint arXiv …, 2024 - arxiv.org
Learning approximations to smooth target functions of many variables from finite sets of
pointwise samples is an important task in scientific computing and its many applications in …

Color image recovery using generalized matrix completion over higher-order finite dimensional algebra

L Liao, Z Guo, Q Gao, Y Wang, F Yu, Q Zhao… - Axioms, 2023 - mdpi.com
To improve the accuracy of color image completion with missing entries, we present a
recovery method based on generalized higher-order scalars. We extend the traditional …

Neural networks efficiently learn low-dimensional representations with SGD

A Mousavi-Hosseini, S Park, M Girotti… - arXiv preprint arXiv …, 2022 - arxiv.org
We study the problem of training a two-layer neural network (NN) of arbitrary width using
stochastic gradient descent (SGD), where the input $\boldsymbol{x} \in \mathbb{R}^d$ is …

Provable guarantees for neural networks via gradient feature learning

Z Shi, J Wei, Y Liang - Advances in Neural Information …, 2023 - proceedings.neurips.cc
Neural networks have achieved remarkable empirical performance, while the current
theoretical analysis is not adequate for understanding their success, e.g., the Neural Tangent …

Unveil conditional diffusion models with classifier-free guidance: A sharp statistical theory

H Fu, Z Yang, M Wang, M Chen - arXiv preprint arXiv:2403.11968, 2024 - arxiv.org
Conditional diffusion models serve as the foundation of modern image synthesis and find
extensive application in fields like computational biology and reinforcement learning. In …

Deep nonparametric regression on approximate manifolds: Nonasymptotic error bounds with polynomial prefactors

Y Jiao, G Shen, Y Lin, J Huang - The Annals of Statistics, 2023 - projecteuclid.org
The Annals of Statistics, 2023, Vol. 51, No. 2, 691–716 …

Machine learning for elliptic PDEs: Fast rate generalization bound, neural scaling law and minimax optimality

Y Lu, H Chen, J Lu, L Ying, J Blanchet - arXiv preprint arXiv:2110.06897, 2021 - arxiv.org
In this paper, we study the statistical limits of deep learning techniques for solving elliptic
partial differential equations (PDEs) from random samples using the Deep Ritz Method …

A deep generative approach to conditional sampling

X Zhou, Y Jiao, J Liu, J Huang - Journal of the American Statistical …, 2023 - Taylor & Francis
We propose a deep generative approach to sampling from a conditional distribution based
on a unified formulation of conditional distribution and generalized nonparametric …