A review of uncertainty quantification in deep learning: Techniques, applications and challenges
Uncertainty quantification (UQ) methods play a pivotal role in reducing the impact of uncertainties during both optimization and decision making processes. They have been …
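
One widely used technique in the UQ toolbox such reviews cover is the deep ensemble: train several networks independently and treat their disagreement as epistemic uncertainty. A minimal sketch in JAX (the function name and the [K, N] shape convention are illustrative, not from the paper):

```python
import jax.numpy as jnp

def ensemble_predict(member_preds):
    # member_preds: [K, N] array, one row of predictions per
    # independently trained ensemble member (hypothetical convention)
    mean = member_preds.mean(axis=0)       # ensemble prediction
    epistemic = member_preds.var(axis=0)   # member disagreement as uncertainty
    return mean, epistemic
```

Inputs on which the members diverge are flagged as uncertain; capturing aleatoric noise needs a separate variance head or likelihood model.
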
When Gaussian process meets big data: A review of scalable GPs
The vast quantity of information brought by big data as well as the evolving computer hardware encourages success stories in the machine learning community. In the …
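
For context on why scalability is hard: exact GP regression hinges on factorizing an n x n kernel matrix, an O(n^3) cost that the inducing-point, random-feature, and local-expert approximations surveyed here all try to sidestep. A sketch of the exact computation, assuming an RBF kernel (the helper names are made up for illustration):

```python
import jax.numpy as jnp
from jax.scipy.linalg import cho_solve

def rbf(x1, x2, lengthscale=1.0, variance=1.0):
    sq = ((x1[:, None, :] - x2[None, :, :]) ** 2).sum(-1)
    return variance * jnp.exp(-0.5 * sq / lengthscale ** 2)

def gp_posterior(x_tr, y_tr, x_te, noise=1e-2):
    # exact inference: one O(n^3) Cholesky factorization of the n x n kernel,
    # the step that scalable GP methods exist to approximate
    K = rbf(x_tr, x_tr) + noise * jnp.eye(x_tr.shape[0])
    L = jnp.linalg.cholesky(K)
    alpha = cho_solve((L, True), y_tr)
    Ks = rbf(x_te, x_tr)
    mean = Ks @ alpha
    cov = rbf(x_te, x_te) - Ks @ cho_solve((L, True), Ks.T)
    return mean, cov
```
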
On exact computation with an infinitely wide neural net
How well does a classic deep net architecture like AlexNet or VGG19 classify on a standard dataset such as CIFAR-10 when its “width”—namely, number of channels in convolutional …
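
The object underlying this line of work is the neural tangent kernel, Θ(x, x') = ⟨∇_θ f(x; θ), ∇_θ f(x'; θ)⟩, which stays essentially fixed during training as the width grows, so the trained net behaves like kernel regression with Θ. A finite-width, empirical version is straightforward with autodiff; a sketch assuming a toy two-layer tanh network (not the paper's exact CNN construction):

```python
import jax
import jax.numpy as jnp

def mlp(params, x):
    # toy two-layer network; any architecture works in principle
    (W1, b1), (W2, b2) = params
    return (jnp.tanh(x @ W1 + b1) @ W2 + b2).squeeze(-1)

def empirical_ntk(params, x1, x2):
    # Theta[i, j] = <grad_theta f(x1_i), grad_theta f(x2_j)>
    j1 = jax.jacobian(mlp)(params, x1)   # pytree of per-parameter Jacobians
    j2 = jax.jacobian(mlp)(params, x2)
    f1 = jnp.concatenate([a.reshape(x1.shape[0], -1)
                          for a in jax.tree_util.tree_leaves(j1)], axis=1)
    f2 = jnp.concatenate([a.reshape(x2.shape[0], -1)
                          for a in jax.tree_util.tree_leaves(j2)], axis=1)
    return f1 @ f2.T
```
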
Physics-constrained deep learning for high-dimensional surrogate modeling and uncertainty quantification without labeled data
Surrogate modeling and uncertainty quantification tasks for PDE systems are most often considered as supervised learning problems where input and output data pairs are used for …
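
The label-free idea is to let the governing equations supervise the network: automatic differentiation evaluates the PDE residual at collocation points, and that residual plus boundary terms is the loss. A toy sketch for the ODE u''(x) = -sin(x) with zero boundary values (a placeholder problem, not the paper's high-dimensional convolutional setting):

```python
import jax
import jax.numpy as jnp

def u_net(params, x):
    # tiny scalar-input surrogate; W1, b1, W2 have shape [hidden], b2 is scalar
    W1, b1, W2, b2 = params
    return jnp.tanh(x * W1 + b1) @ W2 + b2

def residual(params, x):
    # equation residual via autodiff: u''(x) + sin(x) for u'' = -sin(x)
    u_xx = jax.grad(jax.grad(u_net, argnums=1), argnums=1)
    return u_xx(params, x) + jnp.sin(x)

def physics_loss(params, xs):
    # no labeled (input, output) pairs anywhere: collocation residuals
    # plus boundary-condition penalties supervise the network
    res = jax.vmap(residual, in_axes=(None, 0))(xs)
    bc = u_net(params, 0.0) ** 2 + u_net(params, jnp.pi) ** 2
    return jnp.mean(res ** 2) + bc
```
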
Bayesian neural networks: An introduction and survey
Neural Networks (NNs) have provided state-of-the-art results for many challenging machine learning tasks such as detection, regression and classification across the domains …
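
One of the approximate-inference schemes such surveys cover is Monte Carlo dropout: leave dropout active at prediction time and read uncertainty off the spread of stochastic forward passes. A sketch assuming a one-hidden-layer network (names and shapes are illustrative):

```python
import jax
import jax.numpy as jnp

def mlp_dropout(params, x, key, rate=0.1):
    (W1, b1), (W2, b2) = params
    h = jax.nn.relu(x @ W1 + b1)
    keep = jax.random.bernoulli(key, 1.0 - rate, h.shape)
    h = jnp.where(keep, h / (1.0 - rate), 0.0)  # dropout stays on at test time
    return (h @ W2 + b2).squeeze(-1)

def mc_predict(params, x, key, n_samples=100):
    # each stochastic forward pass is one draw from the approximate posterior
    keys = jax.random.split(key, n_samples)
    preds = jnp.stack([mlp_dropout(params, x, k) for k in keys])
    return preds.mean(axis=0), preds.std(axis=0)
```
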
Scaling limits of wide neural networks with weight sharing: Gaussian process behavior, gradient independence, and neural tangent kernel derivation
G. Yang, arXiv preprint arXiv:1902.04760, 2019
Several recent trends in machine learning theory and practice, from the design of state-of-the-art Gaussian Process to the convergence analysis of deep neural nets (DNNs) under …
Priors in Bayesian deep learning: A review
V. Fortuin, International Statistical Review, 2022
While the choice of prior is one of the most critical parts of the Bayesian inference workflow, recent Bayesian deep learning models have often fallen back on vague priors, such as …
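
A concrete instance of the vague-prior default: an isotropic Gaussian prior on the weights makes MAP estimation coincide with ordinary weight decay, which is part of why it is the common fallback. The standard correspondence, sketched (not specific to the review):

```latex
p(\theta) = \mathcal{N}(\theta \mid 0, \sigma^2 I)
\;\Longrightarrow\;
\log p(\theta \mid \mathcal{D})
  = \log p(\mathcal{D} \mid \theta)
    - \frac{1}{2\sigma^2} \lVert \theta \rVert_2^2
    + \mathrm{const}
```

so the prior variance acts as an inverse weight-decay coefficient.
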
Deep convolutional networks as shallow Gaussian processes
We show that the output of a (residual) convolutional neural network (CNN) with an appropriate prior over the weights and biases is a Gaussian process (GP) in the limit of …
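
The limiting GP is defined by a layerwise kernel recursion; for a nonlinearity φ with weight variance σ_w² and bias variance σ_b², the standard form (with convolutional layers contributing sums over patches) is:

```latex
K^{(l+1)}(x, x') = \sigma_b^2
  + \sigma_w^2 \, \mathbb{E}_{f \sim \mathcal{GP}(0, K^{(l)})}
    \left[ \phi(f(x)) \, \phi(f(x')) \right]
```

For ReLU this expectation has a closed form (the arc-cosine kernel), so the whole kernel can be evaluated exactly, layer by layer.
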
Data learning: Integrating data assimilation and machine learning
Data Assimilation (DA) is the approximation of the true state of some physical system by combining observations with a dynamic model. DA incorporates observational data into a …
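
The core DA operation the abstract alludes to is the analysis step that blends a model forecast with observations; in the Kalman filter it is a single linear-algebra update. A generic sketch (variable names follow common DA notation; this is not the paper's specific method):

```python
import jax.numpy as jnp

def analysis_step(x_b, P, H, R, y):
    # x_b: background (model forecast) state, P: its error covariance
    # H: observation operator, R: observation error covariance, y: observations
    S = H @ P @ H.T + R                        # innovation covariance
    K = P @ H.T @ jnp.linalg.inv(S)            # Kalman gain
    x_a = x_b + K @ (y - H @ x_b)              # analysis (corrected) state
    P_a = (jnp.eye(P.shape[0]) - K @ H) @ P    # analysis covariance
    return x_a, P_a
```

Variational and ensemble DA methods generalize this update; the machine learning angle is to replace or augment pieces of it with trained models.
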
Wide feedforward or recurrent neural networks of any architecture are Gaussian processes
G. Yang, Advances in Neural Information Processing Systems, 2019
Wide neural networks with random weights and biases are Gaussian processes, as observed by Neal (1995) for shallow networks, and more recently by Lee et al. (2018) and …
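
The claim is easy to probe numerically: across many random draws of a wide network, the outputs at fixed inputs approach a Gaussian whose covariance is the NNGP kernel. A toy check for a one-hidden-layer tanh net (the widths and variance scalings are chosen for illustration):

```python
import jax
import jax.numpy as jnp

def random_net(key, xs, width=4096, sigma_w=1.4, sigma_b=0.1):
    # one-hidden-layer tanh net with 1/sqrt(fan-in) scaling so the limit exists
    k1, k2, k3, k4 = jax.random.split(key, 4)
    d = xs.shape[1]
    W1 = sigma_w / jnp.sqrt(d) * jax.random.normal(k1, (d, width))
    b1 = sigma_b * jax.random.normal(k2, (width,))
    W2 = sigma_w / jnp.sqrt(width) * jax.random.normal(k3, (width, 1))
    b2 = sigma_b * jax.random.normal(k4, (1,))
    return (jnp.tanh(xs @ W1 + b1) @ W2 + b2).squeeze(-1)

# outputs at a fixed input, across random draws of the network, look Gaussian
xs = jnp.ones((1, 3))
keys = jax.random.split(jax.random.PRNGKey(0), 2000)
samples = jax.vmap(lambda k: random_net(k, xs))(keys)[:, 0]
print(samples.mean(), samples.var())  # mean near 0, variance near the NNGP value
```
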