Score approximation, estimation and distribution recovery of diffusion models on low-dimensional data
Diffusion models achieve state-of-the-art performance in various generation tasks. However,
their theoretical foundations fall far behind. This paper studies score approximation …
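For orientation, the central object in this line of work is the score of the noised data distribution; a standard formulation (generic diffusion-model notation, not taken from this particular paper) is
\[
s_\theta(x,t) \approx \nabla_x \log p_t(x), \qquad
\min_\theta \; \mathbb{E}_{t,\, x_0,\, x_t} \big\| s_\theta(x_t,t) - \nabla_{x_t} \log p_t(x_t \mid x_0) \big\|_2^2 ,
\]
where $p_t$ is the marginal of the forward noising process at time $t$ and the minimization is the usual denoising score matching objective.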
Opportunities and challenges of diffusion models for generative AI
Diffusion models, a powerful and universal generative artificial intelligence technology, have
achieved tremendous success and opened up new possibilities in diverse applications. In …
Universal approximation with deep narrow networks
P Kidger, T Lyons - Conference on learning theory, 2020 - proceedings.mlr.press
The classical Universal Approximation Theorem holds for neural networks of
arbitrary width and bounded depth. Here we consider the natural 'dual' scenario for networks …
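For reference, the classical statement alluded to here is (standard form, recalled from general knowledge rather than from the snippet): for any continuous, non-polynomial activation $\sigma$, compact $K \subset \mathbb{R}^d$, $f \in C(K)$ and $\varepsilon > 0$, there is a one-hidden-layer network
\[
g(x) = \sum_{i=1}^{N} a_i \, \sigma(w_i^\top x + b_i)
\quad \text{such that} \quad
\sup_{x \in K} |f(x) - g(x)| < \varepsilon .
\]
The 'dual' scenario studied in this paper instead bounds the width (in terms of the input and output dimensions only) and lets the depth grow.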
The modern mathematics of deep learning
We describe the new field of the mathematical analysis of deep learning. This field emerged
around a list of research questions that were not answered within the classical framework of …
Deep network approximation for smooth functions
This paper establishes the optimal approximation error characterization of deep rectified
linear unit (ReLU) networks for smooth functions in terms of both width and depth …
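Guarantees of this kind are usually stated as sup-norm error bounds that decay in both width and depth; a representative form (recalled from this literature in general, with constants and log factors omitted) is that for $f \in C^s([0,1]^d)$, ReLU networks of width $O(N \log N)$ and depth $O(L \log L)$ achieve
\[
\sup_{x \in [0,1]^d} |f(x) - \phi(x)| \;\lesssim\; \|f\|_{C^s}\, N^{-2s/d} L^{-2s/d} .
\]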
A survey on statistical theory of deep learning: Approximation, training dynamics, and generative models
In this article, we review the literature on statistical theories of neural networks from three
perspectives: approximation, training dynamics, and generative models. In the first part …
Provable guarantees for neural networks via gradient feature learning
Neural networks have achieved remarkable empirical performance, while the current
theoretical analysis is not adequate for understanding their success, e.g., the Neural Tangent …
Neural network approximation: Three hidden layers are enough
A three-hidden-layer neural network with super approximation power is introduced. This
network is built with the floor function (⌊x⌋), the exponential function (2^x), the step function …
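As a small reading aid, the three elementary functions named in the abstract are easy to write down; the sketch below (plain NumPy, illustrating only the activations themselves, not the paper's construction) evaluates them elementwise.

import numpy as np

def floor_act(x):
    # Floor activation: largest integer <= x, applied elementwise.
    return np.floor(x)

def exp2_act(x):
    # Exponential activation 2**x, applied elementwise.
    return np.exp2(x)

def step_act(x):
    # Step (Heaviside-style) activation: 1 where x >= 0, else 0.
    return (x >= 0).astype(float)

x = np.linspace(-2.0, 2.0, 9)
print(floor_act(x), exp2_act(x), step_act(x))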
Nonparametric regression on low-dimensional manifolds using deep ReLU networks: Function approximation and statistical recovery
Real-world data often exhibit low-dimensional geometric structures and can be viewed as
samples near a low-dimensional manifold. This paper studies nonparametric regression of …
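The usual payoff of such low-dimensional assumptions is that statistical rates depend on the intrinsic dimension $d$ of the manifold rather than the ambient dimension $D$; the typical shape of such a guarantee (the standard nonparametric rate, recalled from general knowledge, log factors omitted) is
\[
\mathbb{E} \,\| \hat f_n - f \|_{L^2}^2 \;\lesssim\; n^{-\frac{2s}{2s+d}}, \qquad d \ll D ,
\]
for $s$-Hölder regression functions and sample size $n$.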
Deep network approximation: Beyond ReLU to diverse activation functions
This paper explores the expressive power of deep neural networks for a diverse range of
activation functions. An activation function set A is defined to encompass the majority of …