How good is the Bayes posterior in deep neural networks really?

F Wenzel, K Roth, BS Veeling, J Świątkowski… - arXiv preprint arXiv …, 2020 - arxiv.org
During the past five years the Bayesian deep learning community has developed
increasingly accurate and efficient approximate inference procedures that allow for …

Efficient and scalable Bayesian neural nets with rank-1 factors

M Dusenberry, G Jerfel, Y Wen, Y Ma… - International …, 2020 - proceedings.mlr.press
Bayesian neural networks (BNNs) demonstrate promising success in improving the
robustness and uncertainty quantification of modern deep learning. However, they generally …

Scaling Hamiltonian Monte Carlo inference for Bayesian neural networks with symmetric splitting

AD Cobb, B Jalaian - Uncertainty in Artificial Intelligence, 2021 - proceedings.mlr.press
Hamiltonian Monte Carlo (HMC) is a Markov chain Monte Carlo (MCMC) approach
that exhibits favourable exploration properties in high-dimensional models such as neural …

All you need is a good functional prior for Bayesian deep learning

BH Tran, S Rossi, D Milios, M Filippone - Journal of Machine Learning …, 2022 - jmlr.org
The Bayesian treatment of neural networks dictates that a prior distribution is specified over
their weight and bias parameters. This poses a challenge because modern neural networks …

On uncertainty, tempering, and data augmentation in Bayesian classification

S Kapoor, WJ Maddox, P Izmailov… - Advances in Neural …, 2022 - proceedings.neurips.cc
Aleatoric uncertainty captures the inherent randomness of the data, such as measurement
noise. In Bayesian regression, we often use a Gaussian observation model, where we …

Quantifying uncertainty in deep spatiotemporal forecasting

D Wu, L Gao, M Chinazzi, X Xiong… - Proceedings of the 27th …, 2021 - dl.acm.org
Deep learning is gaining increasing popularity for spatiotemporal forecasting. However,
prior works have mostly focused on point estimates without quantifying the uncertainty of the …

Being Bayesian about categorical probability

T Joo, U Chung, MG Seo - International conference on …, 2020 - proceedings.mlr.press
Neural networks utilize the softmax as a building block in classification tasks, which
suffers from an overconfidence problem and lacks the ability to represent uncertainty. As a Bayesian …

Scalable Bayesian uncertainty quantification for neural network potentials: promise and pitfalls

S Thaler, G Doehner, J Zavadlav - Journal of Chemical Theory …, 2023 - ACS Publications
Neural network (NN) potentials promise highly accurate molecular dynamics (MD)
simulations within the computational complexity of classical MD force fields. However, when …

Distance-based learning from errors for confidence calibration

C **ng, S Arik, Z Zhang, T Pfister - arxiv preprint arxiv:1912.01730, 2019 - arxiv.org
Deep neural networks (DNNs) are poorly calibrated when trained in conventional ways. To
improve confidence calibration of DNNs, we propose a novel training method, distance …

Low-precision stochastic gradient Langevin dynamics

R Zhang, AG Wilson, C De Sa - International Conference on …, 2022 - proceedings.mlr.press
While low-precision optimization has been widely used to accelerate deep learning, low-
precision sampling remains largely unexplored. As a consequence, sampling is simply …