Score approximation, estimation and distribution recovery of diffusion models on low-dimensional data
Diffusion models achieve state-of-the-art performance in various generation tasks. However,
their theoretical foundations fall far behind. This paper studies score approximation …
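Background note (standard material, not this paper's specific statement): "score approximation" refers to how well a neural network $s_\theta$ can approximate the score $\nabla_x \log p_t(x)$ of the noised data distribution. Assuming the usual forward process $x_t = \alpha_t x_0 + \sigma_t \varepsilon$ with $\varepsilon \sim \mathcal{N}(0, I_d)$, the network is trained with the denoising score matching objective

\min_\theta \; \mathbb{E}_{t,\, x_0,\, \varepsilon} \Big\| s_\theta(x_t, t) - \nabla_{x_t} \log p_t(x_t \mid x_0) \Big\|_2^2,
\qquad
\nabla_{x_t} \log p_t(x_t \mid x_0) = -\frac{x_t - \alpha_t x_0}{\sigma_t^2} = -\frac{\varepsilon}{\sigma_t},

whose population minimizer is the true score $\nabla_{x_t} \log p_t(x_t)$.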
A survey on statistical theory of deep learning: Approximation, training dynamics, and generative models
In this article, we review the literature on statistical theories of neural networks from three
perspectives: approximation, training dynamics, and generative models. In the first part …
Learning smooth functions in high dimensions: from sparse polynomials to deep neural networks
Learning approximations to smooth target functions of many variables from finite sets of
pointwise samples is an important task in scientific computing and its many applications in …
Color image recovery using generalized matrix completion over higher-order finite dimensional algebra
L Liao, Z Guo, Q Gao, Y Wang, F Yu, Q Zhao… - Axioms, 2023 - mdpi.com
To improve the accuracy of color image completion with missing entries, we present a
recovery method based on generalized higher-order scalars. We extend the traditional …
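For reference, the classical matrix-completion baseline that this line of work generalizes (the higher-order-algebra extension itself is not reproduced here) recovers a low-rank matrix $M$ from entries observed on an index set $\Omega$ via nuclear-norm minimization,

\min_{X \in \mathbb{R}^{m \times n}} \; \|X\|_* \quad \text{subject to} \quad X_{ij} = M_{ij} \ \ \text{for } (i,j) \in \Omega,

where $\|X\|_*$ is the sum of singular values of $X$.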
Neural networks efficiently learn low-dimensional representations with SGD
We study the problem of training a two-layer neural network (NN) of arbitrary width using
stochastic gradient descent (SGD) where the input $\boldsymbol{x} \in \mathbb{R}^d$ is …
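As an illustration of the kind of setup described above (a minimal sketch under assumed choices, not the paper's construction): a two-layer ReLU network trained with plain one-sample SGD on inputs $x \in \mathbb{R}^d$ whose labels depend only on a single hidden direction $u$. The link function g and all hyperparameters below are hypothetical.

import numpy as np

rng = np.random.default_rng(0)
d, width, n_steps, lr = 32, 256, 50_000, 0.005

u = rng.standard_normal(d)
u /= np.linalg.norm(u)                      # hidden 1-D relevant direction
g = lambda z: np.maximum(z, 0.0) - 0.3 * z  # hypothetical single-index link function

W = rng.standard_normal((width, d)) / np.sqrt(d)   # first layer
b = np.zeros(width)
a = rng.standard_normal(width) / np.sqrt(width)    # second layer

for _ in range(n_steps):
    x = rng.standard_normal(d)              # one fresh sample per SGD step
    y = g(u @ x)
    pre = W @ x + b
    h = np.maximum(pre, 0.0)                # hidden ReLU features
    err = a @ h - y                         # residual of the squared loss 0.5 * err**2
    mask = (pre > 0).astype(float)          # ReLU derivative
    grad_a = err * h
    grad_b = err * a * mask
    grad_W = np.outer(grad_b, x)
    a -= lr * grad_a
    b -= lr * grad_b
    W -= lr * grad_W

# Rough diagnostic: how strongly do first-layer rows correlate with u?
alignment = np.abs(W @ u) / np.linalg.norm(W, axis=1)
print("mean |cos(W_i, u)| after SGD:", alignment.mean())

The final print is only a diagnostic: under single-index targets like this one, feature-learning analyses predict that first-layer weights concentrate around u, which is the low-dimensional representation in question.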
Provable guarantees for neural networks via gradient feature learning
Neural networks have achieved remarkable empirical performance, while the current
theoretical analysis is not adequate for understanding their success, e.g., the Neural Tangent …
Unveil conditional diffusion models with classifier-free guidance: A sharp statistical theory
Conditional diffusion models serve as the foundation of modern image synthesis and find
extensive application in fields like computational biology and reinforcement learning. In …
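For readers unfamiliar with the term, classifier-free guidance (standard background, not specific to this paper's theory) combines a conditional and an unconditional denoiser at sampling time. With guidance strength $w \ge 0$ and condition $c$, the guided noise prediction is

\tilde{\varepsilon}_\theta(x_t, c) \;=\; (1 + w)\,\varepsilon_\theta(x_t, c) \;-\; w\,\varepsilon_\theta(x_t, \varnothing),

where $\varepsilon_\theta(x_t, \varnothing)$ is the same network evaluated with the condition dropped; $w = 0$ recovers plain conditional sampling.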
Deep nonparametric regression on approximate manifolds: Nonasymptotic error bounds with polynomial prefactors
The Annals of Statistics, 2023, Vol. 51, No. 2, 691–716 …
Machine learning for elliptic PDEs: Fast rate generalization bound, neural scaling law and minimax optimality
In this paper, we study the statistical limits of deep learning techniques for solving elliptic
partial differential equations (PDEs) from random samples using the Deep Ritz Method …
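As background on the Deep Ritz Method mentioned above (a sketch for a model problem; the paper's exact equation and boundary conditions may differ): for the elliptic equation $-\Delta u + V u = f$ on a domain $\Omega$ with $V \ge 0$, the method minimizes the variational energy over a neural network class $\mathcal{F}_{\mathrm{NN}}$,

\min_{u \in \mathcal{F}_{\mathrm{NN}}} \; E(u) = \int_\Omega \Big( \tfrac{1}{2}\,|\nabla u(x)|^2 + \tfrac{1}{2}\,V(x)\,u(x)^2 - f(x)\,u(x) \Big)\, dx,

with the integral estimated by Monte Carlo from random samples of $x$, which is exactly where the statistical questions about generalization arise.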
A deep generative approach to conditional sampling
We propose a deep generative approach to sampling from a conditional distribution based
on a unified formulation of conditional distribution and generalized nonparametric …