Stochastic gradient Markov chain Monte Carlo

C Nemeth, P Fearnhead - Journal of the American Statistical …, 2021 - Taylor & Francis
Markov chain Monte Carlo (MCMC) algorithms are generally regarded as the gold
standard technique for Bayesian inference. They are theoretically well-understood and …

A survey of uncertainty in deep neural networks

J Gawlikowski, CRN Tassi, M Ali, J Lee, M Humt… - Artificial Intelligence …, 2023 - Springer
Over the last decade, neural networks have reached almost every field of science and
become a crucial part of various real-world applications. Due to the increasing spread …

Learning generative vision transformer with energy-based latent space for saliency prediction

J Zhang, J **e, N Barnes, P Li - Advances in Neural …, 2021 - proceedings.neurips.cc
Vision transformer networks have shown superiority in many computer vision tasks. In this
paper, we take a step further by proposing a novel generative vision transformer with latent …

The zig-zag process and super-efficient sampling for Bayesian analysis of big data

J Bierkens, P Fearnhead, G Roberts - 2019 - projecteuclid.org
The Annals of Statistics, 2019, Vol. 47, No. 3, 1288–1320. https://doi.org/10.1214/18-AOS1715
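As a rough illustration of the kind of sampler this paper studies, the sketch below simulates a one-dimensional zig-zag process for a standard Gaussian target. It is written from the generic description of the process (linear motion with velocity ±1, velocity flips at events of an inhomogeneous Poisson process with rate max(0, v·U'(x + v·s))), not from the authors' code; the Gaussian target is an assumption chosen because its event times can be sampled exactly.

```python
# Minimal 1-D zig-zag sketch (standard Gaussian target); illustrative only.
import numpy as np

rng = np.random.default_rng(4)

def next_flip_time(x, v):
    """Exact event time when the flip rate is max(0, v*x + s) (Gaussian case)."""
    a = v * x
    e = rng.exponential()
    if a >= 0.0:
        return -a + np.sqrt(a * a + 2.0 * e)
    return -a + np.sqrt(2.0 * e)           # no events until the rate turns positive

x, v, t = 0.0, 1.0, 0.0
skeleton = []                               # (event time, position, new velocity)
for _ in range(10_000):
    tau = next_flip_time(x, v)
    t += tau
    x += v * tau                            # deterministic linear motion between events
    v = -v                                  # flip the velocity at the event
    skeleton.append((t, x, v))

# Between skeleton points the path is x(s) = x_k + v_k (s - t_k); its
# occupation measure (time-weighted positions) targets the standard Gaussian.
```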

Global convergence of Langevin dynamics based algorithms for nonconvex optimization

P Xu, J Chen, D Zou, Q Gu - Advances in Neural …, 2018 - proceedings.neurips.cc
We present a unified framework to analyze the global convergence of Langevin dynamics
based algorithms for nonconvex finite-sum optimization with $n$ component functions. At …
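A minimal sketch of the kind of algorithm this analysis covers: stochastic gradient Langevin dynamics used as a nonconvex optimizer on a toy finite-sum objective. The objective, step size, and inverse temperature below are illustrative assumptions, not taken from the paper.

```python
# Toy sketch of Langevin-dynamics-based nonconvex optimization (not the paper's code).
import numpy as np

rng = np.random.default_rng(2)
n = 50
a = rng.normal(size=n)                      # per-component shifts in the finite sum

def stoch_grad(x, idx):
    """Minibatch gradient of f(x) = (1/n) * sum_i ((x - a_i)^2 - 1)^2."""
    xi = x - a[idx]
    return np.mean(4.0 * xi * (xi ** 2 - 1.0))

x, eta, beta = 3.0, 1e-2, 50.0              # start away from the minima; beta = inverse temperature
for _ in range(5_000):
    idx = rng.choice(n, size=8, replace=False)
    # Gradient step plus temperature-scaled Gaussian noise (helps escape poor local minima).
    x = x - eta * stoch_grad(x, idx) + np.sqrt(2.0 * eta / beta) * rng.normal()
print(f"approximate minimiser: {x:.3f}")
```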

Langevin Monte Carlo for contextual bandits

P Xu, H Zheng, EV Mazumdar… - International …, 2022 - proceedings.mlr.press
We study the efficiency of Thompson sampling for contextual bandits. Existing Thompson
sampling-based algorithms need to construct a Laplace approximation (i.e., a Gaussian …
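The sketch below illustrates the alternative this paper points towards: Thompson sampling in a linear-Gaussian contextual bandit where each round's posterior sample comes from a few Langevin Monte Carlo steps (warm-started at the previous sample) rather than a Laplace approximation. The model, step-size schedule, and constants are illustrative assumptions, not the authors' algorithm or settings.

```python
# Sketch: Thompson sampling with Langevin Monte Carlo posterior samples
# (linear-Gaussian contextual bandit; all constants are illustrative).
import numpy as np

rng = np.random.default_rng(3)
d, K, sigma2, prior_var = 5, 10, 0.25, 1.0
theta_star = rng.normal(size=d)             # unknown reward parameter
X_hist, y_hist = [], []

def grad_log_post(theta):
    """Gradient of the log posterior under a Gaussian prior and Gaussian rewards."""
    g = -theta / prior_var
    for x, r in zip(X_hist, y_hist):
        g += x * (r - x @ theta) / sigma2
    return g

theta = np.zeros(d)
for t in range(200):
    eps = 0.1 / (1.0 + len(X_hist))         # shrink the step as the posterior sharpens
    for _ in range(20):                     # a few LMC steps, warm-started at the last sample
        theta = (theta + 0.5 * eps * grad_log_post(theta)
                 + np.sqrt(eps) * rng.normal(size=d))
    contexts = rng.normal(size=(K, d))      # arm features for this round
    arm = int(np.argmax(contexts @ theta))  # act greedily w.r.t. the posterior sample
    reward = contexts[arm] @ theta_star + np.sqrt(sigma2) * rng.normal()
    X_hist.append(contexts[arm]); y_hist.append(reward)
```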

Structured logconcave sampling with a restricted Gaussian oracle

YT Lee, R Shen, K Tian - Conference on Learning Theory, 2021 - proceedings.mlr.press
We give algorithms for sampling several structured logconcave families to high accuracy.
We further develop a reduction framework, inspired by proximal point methods in convex …

Control variates for stochastic gradient MCMC

J Baker, P Fearnhead, EB Fox, C Nemeth - Statistics and Computing, 2019 - Springer
It is well known that Markov chain Monte Carlo (MCMC) methods scale poorly with dataset
size. A popular class of methods for solving this issue is stochastic gradient MCMC …
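A hedged sketch of the control-variate idea for stochastic-gradient estimates, written for a toy Gaussian-mean model rather than taken from the paper: centre the minibatch gradient at a fixed point theta_hat (e.g. an approximate posterior mode), so that the estimator's variance shrinks as the chain approaches that point.

```python
# Control-variate gradient estimator sketch (toy Gaussian-mean model, not the paper's code).
import numpy as np

rng = np.random.default_rng(1)
N = 1_000
y = rng.normal(loc=2.0, scale=1.0, size=N)    # data with unknown mean, unit variance

def grad_i(theta, i):
    """Gradient of log p(y_i | theta) for the Gaussian-mean model."""
    return y[i] - theta

theta_hat = y.mean()                           # fixed centring point, e.g. an approximate mode
full_grad_at_hat = np.sum(y - theta_hat)       # one full-data gradient, computed once

def cv_grad(theta, idx):
    """Unbiased full-data gradient estimate; variance shrinks as theta nears theta_hat."""
    correction = sum(grad_i(theta, i) - grad_i(theta_hat, i) for i in idx)
    return full_grad_at_hat + (N / len(idx)) * correction

idx = rng.choice(N, size=10, replace=False)
print(cv_grad(theta_hat + 0.01, idx))          # exact here because grad_i is linear in theta
```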

The promises and pitfalls of stochastic gradient Langevin dynamics

N Brosse, A Durmus… - Advances in Neural …, 2018 - proceedings.neurips.cc
Stochastic Gradient Langevin Dynamics (SGLD) has emerged as a key MCMC
algorithm for Bayesian learning from large scale datasets. While SGLD with decreasing step …
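For reference, a minimal SGLD sketch on a toy Bayesian linear-regression posterior, including the decreasing step-size schedule the snippet alludes to. The model, data sizes, and schedule constants are illustrative assumptions, not the paper's settings.

```python
# Minimal SGLD sketch on a toy Bayesian linear regression (illustrative settings only).
import numpy as np

rng = np.random.default_rng(0)
N, d = 10_000, 5
X = rng.normal(size=(N, d))
theta_true = rng.normal(size=d)
y = X @ theta_true + rng.normal(scale=0.5, size=N)

def stoch_grad_log_post(theta, idx, sigma2=0.25, prior_var=10.0):
    """Unbiased minibatch estimate of the gradient of the log posterior."""
    Xb, yb = X[idx], y[idx]
    grad_lik = (N / len(idx)) * Xb.T @ (yb - Xb @ theta) / sigma2
    return grad_lik - theta / prior_var

theta = np.zeros(d)
samples = []
for t in range(1, 5_001):
    eps = 1e-5 * t ** (-0.33)                  # decreasing step-size schedule
    idx = rng.choice(N, size=100, replace=False)
    theta = (theta + 0.5 * eps * stoch_grad_log_post(theta, idx)
             + np.sqrt(eps) * rng.normal(size=d))   # injected Gaussian noise
    samples.append(theta.copy())
```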

Piecewise deterministic Markov processes for continuous-time Monte Carlo

P Fearnhead, J Bierkens, M Pollock, GO Roberts - Statistical Science, 2018 - JSTOR
Recently, there have been conceptually new developments in Monte Carlo methods through
the introduction of new MCMC and sequential Monte Carlo (SMC) algorithms which are …