Quantitative CLTs in deep neural networks
We study the distribution of a fully connected neural network with random Gaussian weights
and biases in which the hidden layer widths are proportional to a large constant n. Under …
Gaussian random field approximation via Stein's method with applications to wide random neural networks
We derive upper bounds on the Wasserstein distance (W₁), with respect to the sup-norm,
between any continuous ℝ^d-valued random field indexed by the n-sphere and the …
Quantitative central limit theorems for the parabolic Anderson model driven by colored noises
In this paper, we study the spatial averages of the solution to the parabolic Anderson model
driven by a space-time Gaussian homogeneous noise that is colored in both time and …
The hyperbolic Anderson model: moment estimates of the Malliavin derivatives and applications
In this article, we study the hyperbolic Anderson model driven by a space-time colored
Gaussian homogeneous noise with spatial dimension d = 1, 2. Under mild assumptions, we …
Quantitative CLTs on the Poisson space via Skorohod estimates and -Poincaré inequalities
T. Trauthwein - arXiv preprint arXiv:2212.03782, 2022 - arxiv.org
We establish new explicit bounds on the Gaussian approximation of Poisson functionals
based on novel estimates of moments of Skorohod integrals. Combining these with the …
Non-asymptotic approximations of Gaussian neural networks via second-order Poincaré inequalities
There is a recent and growing literature on large-width asymptotic and non-asymptotic
properties of deep Gaussian neural networks (NNs), namely NNs with weights initialized as …
Almost sure central limit theorems for parabolic/hyperbolic Anderson models with Gaussian colored noises
This short note is devoted to establishing the almost sure central limit theorem for the
parabolic/hyperbolic Anderson models driven by colored-in-time Gaussian noises …
Hyperbolic Anderson model with Lévy white noise: Spatial ergodicity and fluctuation
In this paper, we study one-dimensional hyperbolic Anderson models (HAM) driven by
space-time pure-jump Lévy white noise in a finite-variance setting. Motivated by recent …
An MMSE lower bound via Poincaré inequality
This paper studies the minimum mean squared error (MMSE) of estimating X ∈ ℝ^d from the
noisy observation Y ∈ ℝ^k, under the assumption that the noise (i.e., Y | X) is a member of the …
Limit theorems for Gaussian fields via chaos expansions and applications
G. Giorgio - arXiv preprint arXiv:2406.15801, 2024 - arxiv.org
In this PhD thesis, we apply a combination of Malliavin calculus and Stein's method in the
framework of probability approximations. The specific problems we tackle with these …