How good is the Bayes posterior in deep neural networks really?
During the past five years the Bayesian deep learning community has developed
increasingly accurate and efficient approximate inference procedures that allow for …
Efficient and scalable Bayesian neural nets with rank-1 factors
Bayesian neural networks (BNNs) demonstrate promising success in improving the
robustness and uncertainty quantification of modern deep learning. However, they generally …
Scaling Hamiltonian Monte Carlo inference for Bayesian neural networks with symmetric splitting
Hamiltonian Monte Carlo (HMC) is a Markov chain Monte Carlo (MCMC) approach
that exhibits favourable exploration properties in high-dimensional models such as neural …
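For orientation, a single standard leapfrog trajectory is the core of any HMC variant; the sketch below shows the plain integrator on a toy Gaussian target, not the paper's symmetric splitting scheme, which additionally partitions the data and alternates sub-updates.

```python
import numpy as np

# Plain leapfrog integrator for HMC (illustrative only; the cited paper's
# symmetric splitting scheme is NOT implemented here).
def leapfrog(q, p, grad_U, step_size, n_steps):
    q, p = q.copy(), p.copy()
    p -= 0.5 * step_size * grad_U(q)      # initial half step in momentum
    for _ in range(n_steps - 1):
        q += step_size * p                # full step in position
        p -= step_size * grad_U(q)        # full step in momentum
    q += step_size * p
    p -= 0.5 * step_size * grad_U(q)      # final half step in momentum
    return q, p

# Toy potential U(q) = q^2 / 2 (standard Gaussian), so grad_U(q) = q.
q0, p0 = np.array([1.0]), np.array([0.0])
q1, p1 = leapfrog(q0, p0, lambda q: q, step_size=0.1, n_steps=10)

# The Hamiltonian H = U + K is approximately conserved by leapfrog,
# which is what gives HMC its high acceptance rates.
H0 = 0.5 * q0[0] ** 2 + 0.5 * p0[0] ** 2
H1 = 0.5 * q1[0] ** 2 + 0.5 * p1[0] ** 2
```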
All you need is a good functional prior for Bayesian deep learning
The Bayesian treatment of neural networks dictates that a prior distribution is specified over
their weight and bias parameters. This poses a challenge because modern neural networks …
On uncertainty, tempering, and data augmentation in Bayesian classification
Aleatoric uncertainty captures the inherent randomness of the data, such as measurement
noise. In Bayesian regression, we often use a Gaussian observation model, where we …
Quantifying uncertainty in deep spatiotemporal forecasting
Deep learning is gaining increasing popularity for spatiotemporal forecasting. However,
prior works have mostly focused on point estimates without quantifying the uncertainty of the …
Being Bayesian about categorical probability
Neural networks utilize the softmax as a building block in classification tasks, which contains
an overconfidence problem and lacks an uncertainty representation ability. As a Bayesian …
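The contrast the abstract alludes to can be shown in a few lines: a softmax/MLE point estimate yields a single probability vector, while a Dirichlet posterior over the categorical probabilities also carries uncertainty. This is a generic conjugate-Dirichlet sketch, not the paper's exact construction.

```python
import numpy as np

# Hypothetical observed class counts for a 3-class problem.
counts = np.array([3.0, 1.0, 0.0])

# Point estimate (MLE of the categorical distribution): assigns
# exactly zero probability to the unseen class.
mle = counts / counts.sum()

# Dirichlet posterior under a uniform Dirichlet(1, 1, 1) prior:
# alpha = counts + prior pseudo-counts.
alpha = counts + 1.0
posterior_mean = alpha / alpha.sum()

# Per-class posterior variance shrinks as more data arrives,
# giving an explicit uncertainty representation the softmax lacks.
a0 = alpha.sum()
posterior_var = alpha * (a0 - alpha) / (a0 ** 2 * (a0 + 1))
```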
Scalable Bayesian uncertainty quantification for neural network potentials: promise and pitfalls
Neural network (NN) potentials promise highly accurate molecular dynamics (MD)
simulations within the computational complexity of classical MD force fields. However, when …
Distance-based learning from errors for confidence calibration
Deep neural networks (DNNs) are poorly calibrated when trained in conventional ways. To
improve confidence calibration of DNNs, we propose a novel training method, distance …
Low-precision stochastic gradient Langevin dynamics
While low-precision optimization has been widely used to accelerate deep learning, low-
precision sampling remains largely unexplored. As a consequence, sampling is simply …