Rapid convergence of the unadjusted Langevin algorithm: Isoperimetry suffices
Abstract We study the Unadjusted Langevin Algorithm (ULA) for sampling from a probability
distribution $\nu = e^{-f}$ on $\mathbb{R}^n$. We prove a convergence guarantee in Kullback …
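The ULA iteration analyzed in this line of work is the Euler–Maruyama discretization of the Langevin diffusion, $x_{k+1} = x_k - h\,\nabla f(x_k) + \sqrt{2h}\,\xi_k$ with $\xi_k \sim \mathcal{N}(0, I)$. A minimal sketch of that iteration (the target, step size, and function names below are illustrative, not taken from the paper):

```python
import numpy as np

def ula_sample(grad_f, x0, step, n_iters, rng=None):
    """Run ULA: x_{k+1} = x_k - step * grad_f(x_k) + sqrt(2*step) * N(0, I)."""
    rng = np.random.default_rng() if rng is None else rng
    x = np.asarray(x0, dtype=float)
    for _ in range(n_iters):
        x = x - step * grad_f(x) + np.sqrt(2.0 * step) * rng.standard_normal(x.shape)
    return x

# Example target: standard Gaussian, f(x) = ||x||^2 / 2, so grad_f(x) = x.
samples = np.array([ula_sample(lambda x: x, np.zeros(2), 0.1, 200) for _ in range(500)])
```

Note that without a Metropolis correction the chain is biased: for this Gaussian target the stationary variance is $1/(1 - h/2)$ per coordinate rather than $1$, which is the discretization error the convergence guarantees quantify in terms of the step size.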
Forward-backward Gaussian variational inference via JKO in the Bures-Wasserstein space
Variational inference (VI) seeks to approximate a target distribution $\pi$ by an element of a
tractable family of distributions. Of key interest in statistics and machine learning is Gaussian …
Statistical optimal transport
Statistical Optimal Transport, arXiv:2407.18163v2 [math.ST], 7 Nov 2024. Sinho Chewi (Yale), Jonathan Niles-Weed (NYU), Philippe Rigollet (MIT) …
Towards a complete analysis of Langevin Monte Carlo: Beyond Poincaré inequality
Langevin diffusions are rapidly convergent under appropriate functional inequality
assumptions. Hence, it is natural to expect that with additional smoothness conditions to …
Convergence of Stein variational gradient descent under a weaker smoothness condition
Abstract Stein Variational Gradient Descent (SVGD) is an important alternative to the
Langevin-type algorithms for sampling from probability distributions of the form $\pi …
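SVGD transports a set of particles toward $\pi$ along a kernelized gradient flow: each particle is moved by an attractive term $k(x_j, x_i)\,\nabla \log \pi(x_j)$ and a repulsive term $\nabla_{x_j} k(x_j, x_i)$ that keeps the particles spread out. A minimal sketch of one update with an RBF kernel (function names and parameters are illustrative, not taken from the paper):

```python
import numpy as np

def svgd_step(X, grad_log_pi, bandwidth, step):
    """One SVGD update with RBF kernel k(x, y) = exp(-||x - y||^2 / (2 * bandwidth))."""
    n = X.shape[0]
    diffs = X[:, None, :] - X[None, :, :]      # (n, n, d); diffs[j, i] = x_j - x_i
    K = np.exp(-np.sum(diffs ** 2, axis=-1) / (2.0 * bandwidth))  # K[j, i] = k(x_j, x_i)
    grads = grad_log_pi(X)                     # (n, d)
    # phi(x_i) = (1/n) * sum_j [ k(x_j, x_i) grad log pi(x_j) + grad_{x_j} k(x_j, x_i) ]
    drift = K.T @ grads                        # attractive term, sum over j
    repulse = -np.einsum('ji,jid->id', K, diffs) / bandwidth  # kernel-gradient term
    return X + step * (drift + repulse) / n

# Example: push particles toward a standard Gaussian, grad log pi(x) = -x.
rng = np.random.default_rng(0)
X = rng.normal(loc=5.0, size=(20, 2))
for _ in range(200):
    X = svgd_step(X, lambda P: -P, bandwidth=1.0, step=0.1)
```

The repulsive term sums to zero over the ensemble (by antisymmetry of `diffs` and symmetry of `K`), so it shapes the spread of the particles without shifting their mean.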
Resolving the mixing time of the Langevin algorithm to its stationary distribution for log-concave sampling
Sampling from a high-dimensional distribution is a fundamental task in statistics,
engineering, and the sciences. A canonical approach is the Langevin Algorithm, i.e., the …
Provably fast finite particle variants of SVGD via virtual particle stochastic approximation
Abstract Stein Variational Gradient Descent (SVGD) is a popular particle-based variational
inference algorithm with impressive empirical performance across various domains …
Particle-based variational inference with generalized Wasserstein gradient flow
Particle-based variational inference methods (ParVIs) such as Stein variational gradient
descent (SVGD) update the particles based on the kernelized Wasserstein gradient flow for …
Nonasymptotic analysis of Stochastic Gradient Hamiltonian Monte Carlo under local conditions for nonconvex optimization
We provide a nonasymptotic analysis of the convergence of the stochastic gradient
Hamiltonian Monte Carlo (SGHMC) to a target measure in Wasserstein-2 distance without …
Faster sampling without isoperimetry via diffusion-based Monte Carlo
To sample from a general target distribution $p_* \propto e^{-f_*}$ beyond the isoperimetric
condition, Huang et al. (2023) proposed to perform sampling through reverse diffusion …