Rapid convergence of the unadjusted Langevin algorithm: Isoperimetry suffices

S Vempala, A Wibisono - Advances in Neural Information Processing Systems, 2019 - proceedings.neurips.cc
We study the Unadjusted Langevin Algorithm (ULA) for sampling from a probability
distribution $\nu = e^{-f}$ on $\mathbb{R}^n$. We prove a convergence guarantee in Kullback–Leibler (KL) divergence …
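Since several entries below concern this algorithm, a minimal sketch of the ULA iteration may help: the discretization $x_{k+1} = x_k - h\nabla f(x_k) + \sqrt{2h}\,\xi_k$ is standard, but the concrete `grad_f`, step size, and Gaussian example here are illustrative choices, not the paper's setup.

```python
import numpy as np

def ula(grad_f, x0, step, n_iters, seed=0):
    """Unadjusted Langevin Algorithm:
    x_{k+1} = x_k - step * grad_f(x_k) + sqrt(2 * step) * xi_k,  xi_k ~ N(0, I)."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    for _ in range(n_iters):
        x = x - step * grad_f(x) + np.sqrt(2.0 * step) * rng.standard_normal(x.shape)
    return x

# Illustrative target: nu = e^{-f} with f(x) = ||x||^2 / 2, i.e. a standard Gaussian.
sample = ula(grad_f=lambda x: x, x0=np.zeros(2), step=1e-2, n_iters=5000)
```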

Forward-backward Gaussian variational inference via JKO in the Bures-Wasserstein space

MZ Diao, K Balasubramanian… - International Conference on Machine Learning, 2023 - proceedings.mlr.press
Variational inference (VI) seeks to approximate a target distribution $\pi$ by an element of a
tractable family of distributions. Of key interest in statistics and machine learning is Gaussian …
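As a rough illustration of optimizing over Gaussians in the Bures-Wasserstein geometry, here is a sketch of a plain Bures-Wasserstein gradient step on $\mathrm{KL}(\mathcal{N}(m,\Sigma)\,\|\,\pi)$ with $\pi \propto e^{-V}$. This is the simpler gradient-descent update, not the paper's forward-backward (JKO) splitting, and the Monte Carlo estimation of the expectations is an assumption made here for concreteness.

```python
import numpy as np

def bw_gradient_step(m, Sigma, grad_V, hess_V, step, n_mc=256, seed=0):
    """One Bures-Wasserstein gradient step on KL(N(m, Sigma) || pi), pi ∝ e^{-V}:
    the mean moves along E[grad V]; the covariance along M = E[hess V] - Sigma^{-1}."""
    rng = np.random.default_rng(seed)
    d = m.shape[0]
    L = np.linalg.cholesky(Sigma)
    xs = m + rng.standard_normal((n_mc, d)) @ L.T        # draws from N(m, Sigma)
    g = np.mean([grad_V(x) for x in xs], axis=0)         # Monte Carlo E[grad V]
    H = np.mean([hess_V(x) for x in xs], axis=0)         # Monte Carlo E[hess V]
    M = H - np.linalg.inv(Sigma)
    A = np.eye(d) - step * M
    return m - step * g, A @ Sigma @ A                   # Sigma stays symmetric PSD for small steps
```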

Statistical optimal transport

S Chewi, J Niles-Weed, P Rigollet - arXiv preprint arXiv:2407.18163, 2024 - arxiv.org
Statistical Optimal Transport. arXiv:2407.18163v2 [math.ST], 7 Nov 2024. Sinho Chewi (Yale), Jonathan Niles-Weed (NYU), Philippe Rigollet (MIT) …

Towards a complete analysis of Langevin Monte Carlo: Beyond Poincaré inequality

A Mousavi-Hosseini, TK Farghly, Y He… - The Thirty Sixth Annual Conference on Learning Theory, 2023 - proceedings.mlr.press
Langevin diffusions are rapidly convergent under appropriate functional inequality
assumptions. Hence, it is natural to expect that with additional smoothness conditions to …
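A standard instance of such a guarantee, stated here only for orientation (this is the classical continuous-time fact, not the paper's result): if $\pi = e^{-f}$ satisfies a log-Sobolev inequality with constant $\alpha$, then the law $\rho_t$ of the Langevin diffusion contracts as

$\mathrm{KL}(\rho_t \,\|\, \pi) \le e^{-2\alpha t}\, \mathrm{KL}(\rho_0 \,\|\, \pi)$,

and the question this paper pursues is what survives when even the weaker Poincaré inequality fails.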

Convergence of Stein variational gradient descent under a weaker smoothness condition

L Sun, A Karagulyan… - International Conference on Artificial Intelligence and Statistics, 2023 - proceedings.mlr.press
Stein Variational Gradient Descent (SVGD) is an important alternative to the
Langevin-type algorithms for sampling from probability distributions of the form $\pi$ …
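For readers unfamiliar with SVGD, a minimal sketch of one update with an RBF kernel follows; the kernel choice, bandwidth, and step size are illustrative assumptions, not the settings analyzed in the paper.

```python
import numpy as np

def svgd_step(X, grad_log_pi, step, bandwidth=1.0):
    """One SVGD update for particles X (shape (n, d)) targeting pi:
    x_i += step/n * sum_j [ k(x_j, x_i) * grad log pi(x_j) + grad_{x_j} k(x_j, x_i) ]
    with RBF kernel k(x, y) = exp(-||x - y||^2 / (2 * bandwidth^2))."""
    diffs = X[:, None, :] - X[None, :, :]                 # diffs[i, j] = x_i - x_j
    K = np.exp(-np.sum(diffs**2, axis=-1) / (2.0 * bandwidth**2))
    scores = np.stack([grad_log_pi(x) for x in X])        # (n, d)
    attract = K @ scores                                  # kernel-weighted scores
    repulse = (K[:, :, None] * diffs).sum(axis=1) / bandwidth**2
    return X + step * (attract + repulse) / X.shape[0]
```

The first term pulls particles toward high-density regions of $\pi$; the second, a kernel-gradient term, pushes them apart so the ensemble does not collapse to the mode.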

Resolving the mixing time of the Langevin algorithm to its stationary distribution for log-concave sampling

JM Altschuler, K Talwar - arXiv preprint arXiv:2210.08448, 2022 - arxiv.org
Sampling from a high-dimensional distribution is a fundamental task in statistics,
engineering, and the sciences. A canonical approach is the Langevin Algorithm, i.e., the …

Provably fast finite particle variants of SVGD via virtual particle stochastic approximation

A Das, D Nagaraj - Advances in Neural Information Processing Systems, 2023 - proceedings.neurips.cc
Stein Variational Gradient Descent (SVGD) is a popular particle-based variational
inference algorithm with impressive empirical performance across various domains …

Particle-based variational inference with generalized Wasserstein gradient flow

Z Cheng, S Zhang, L Yu… - Advances in Neural Information Processing Systems, 2024 - proceedings.neurips.cc
Particle-based variational inference methods (ParVIs) such as Stein variational gradient
descent (SVGD) update the particles based on the kernelized Wasserstein gradient flow for …

Nonasymptotic analysis of Stochastic Gradient Hamiltonian Monte Carlo under local conditions for nonconvex optimization

OD Akyildiz, S Sabanis - Journal of Machine Learning Research, 2024 - jmlr.org
We provide a nonasymptotic analysis of the convergence of the stochastic gradient
Hamiltonian Monte Carlo (SGHMC) to a target measure in Wasserstein-2 distance without …
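A minimal sketch of the SGHMC iteration the analysis concerns, written as an Euler discretization of underdamped Langevin dynamics with a noisy gradient oracle; the friction value and the oracle `stoch_grad` are illustrative assumptions.

```python
import numpy as np

def sghmc(stoch_grad, x0, step, friction, n_iters, seed=0):
    """SGHMC: underdamped Langevin dynamics driven by a stochastic gradient,
    v_{k+1} = v_k - step * (friction * v_k + g(x_k)) + sqrt(2 * friction * step) * xi_k,
    x_{k+1} = x_k + step * v_{k+1}."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    v = np.zeros_like(x)
    for _ in range(n_iters):
        noise = rng.standard_normal(x.shape)
        v = v - step * (friction * v + stoch_grad(x)) + np.sqrt(2.0 * friction * step) * noise
        x = x + step * v
    return x
```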

Faster sampling without isoperimetry via diffusion-based Monte Carlo

X Huang, D Zou, H Dong, YA Ma… - The Thirty Seventh Annual Conference on Learning Theory, 2024 - proceedings.mlr.press
To sample from a general target distribution $p_* \propto e^{-f_*}$ beyond the isoperimetric
condition, Huang et al. (2023) proposed to perform sampling through reverse diffusion …
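As a sketch of the reverse-diffusion idea, here is an Euler-Maruyama discretization of the reverse SDE for an Ornstein-Uhlenbeck forward process. The score oracle `score(y, s)` for the diffused target is the assumption: how that score is estimated without isoperimetry is precisely the contribution of this line of work, and it is treated as a black box here.

```python
import numpy as np

def reverse_diffusion_sampler(score, T, n_steps, dim, seed=0):
    """Euler-Maruyama discretization of the reverse SDE of an OU forward process:
    dY = (Y + 2 * score(Y, s)) dt + sqrt(2) dB,  where s = T - t is the remaining
    forward time and score(y, s) approximates grad log p_s(y), the diffused target's score."""
    rng = np.random.default_rng(seed)
    dt = T / n_steps
    y = rng.standard_normal(dim)                          # initialize from N(0, I) ≈ p_T
    for k in range(n_steps):
        s = T - k * dt
        y = y + dt * (y + 2.0 * score(y, s)) + np.sqrt(2.0 * dt) * rng.standard_normal(dim)
    return y
```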