Riemannian diffusion models

CW Huang, M Aghajohari, J Bose… - Advances in …, 2022 - proceedings.neurips.cc
Diffusion models are recent state-of-the-art methods for image generation and likelihood
estimation. In this work, we generalize continuous-time diffusion models to arbitrary …

Rapid convergence of the unadjusted Langevin algorithm: Isoperimetry suffices

S Vempala, A Wibisono - Advances in neural information …, 2019 - proceedings.neurips.cc
We study the Unadjusted Langevin Algorithm (ULA) for sampling from a probability
distribution $\nu = e^{-f}$ on $\mathbb{R}^n$. We prove a convergence guarantee in Kullback …
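
Note: the ULA update itself is a single gradient step plus Gaussian noise. A minimal sketch, assuming a known potential $f$ supplied as a gradient callback grad_f and a hand-picked step size eta (both names are illustrative, not from the paper):

import numpy as np

def ula(grad_f, x0, eta=1e-2, n_steps=10_000, rng=None):
    # Unadjusted Langevin Algorithm:
    #   x_{k+1} = x_k - eta * grad f(x_k) + sqrt(2 * eta) * xi_k,  xi_k ~ N(0, I)
    rng = np.random.default_rng() if rng is None else rng
    x = np.asarray(x0, dtype=float)
    samples = []
    for _ in range(n_steps):
        x = x - eta * grad_f(x) + np.sqrt(2 * eta) * rng.standard_normal(x.shape)
        samples.append(x.copy())
    return np.array(samples)

# Example: sample from nu = e^{-f} with f(x) = ||x||^2 / 2, i.e. a standard Gaussian.
chain = ula(lambda x: x, x0=np.zeros(2))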

Spherical sliced-Wasserstein

C Bonet, P Berg, N Courty, F Septier, L Drumetz… - arXiv preprint arXiv …, 2022 - arxiv.org
Many variants of the Wasserstein distance have been introduced to reduce its original
computational burden. In particular, the Sliced-Wasserstein distance (SW), which leverages …
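
Note: a Monte Carlo estimate of the Euclidean SW distance shows where the savings come from: each random projection reduces the problem to a 1-D optimal transport problem solved by sorting. A minimal sketch for two equally sized, uniformly weighted samples (function name and defaults are illustrative):

import numpy as np

def sliced_w2(X, Y, n_proj=100, rng=None):
    # X, Y: (n, d) arrays of samples with the same n; returns an SW_2 estimate.
    rng = np.random.default_rng() if rng is None else rng
    theta = rng.standard_normal((n_proj, X.shape[1]))
    theta /= np.linalg.norm(theta, axis=1, keepdims=True)  # uniform directions on the sphere
    # Project onto each direction, then solve the 1-D OT problems by sorting.
    px = np.sort(X @ theta.T, axis=0)
    py = np.sort(Y @ theta.T, axis=0)
    return np.sqrt(np.mean((px - py) ** 2))  # average W_2^2 over projections, then sqrt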

Convergence of kinetic Langevin Monte Carlo on Lie groups

L Kong, M Tao - The Thirty Seventh Annual Conference on …, 2024 - proceedings.mlr.press
Explicit, momentum-based dynamics for optimizing functions defined on Lie groups were
recently constructed, based on techniques such as variational optimization and left …
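
Note: for intuition, here is the Euclidean analogue of kinetic (underdamped) Langevin dynamics with a simple semi-implicit Euler discretization; the Lie-group version in the paper replaces these vector-space updates with group operations. Friction gamma and step size eta are illustrative choices, not the paper's:

import numpy as np

def kinetic_langevin(grad_f, x0, eta=1e-2, gamma=1.0, n_steps=10_000, rng=None):
    # Discretizes dx = v dt,  dv = (-gamma * v - grad f(x)) dt + sqrt(2 * gamma) dB_t.
    rng = np.random.default_rng() if rng is None else rng
    x = np.asarray(x0, dtype=float)
    v = np.zeros_like(x)
    for _ in range(n_steps):
        v = v - eta * (gamma * v + grad_f(x)) + np.sqrt(2 * gamma * eta) * rng.standard_normal(x.shape)
        x = x + eta * v  # position update uses the fresh momentum
    return x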

Convergence of the Riemannian Langevin algorithm

K Gatmiry, SS Vempala - arXiv preprint arXiv:2204.10818, 2022 - arxiv.org
We study the Riemannian Langevin Algorithm for the problem of sampling from a distribution
with density $\nu$ with respect to the natural measure on a manifold with metric $g$. We …
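
Note: in intrinsic notation, the continuous-time diffusion that such algorithms discretize is the standard Riemannian Langevin SDE (a textbook formulation, not quoted from the paper). Writing $\nu = e^{-f}$,
$$ dX_t = -\operatorname{grad}_g f(X_t)\,dt + \sqrt{2}\,dB_t^g, $$
where $\operatorname{grad}_g$ is the Riemannian gradient and $B_t^g$ is Brownian motion on $(M, g)$; its stationary distribution is $e^{-f}\,d\mathrm{vol}_g$, i.e. density $\nu$ with respect to the natural measure.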

Stereographic spherical sliced Wasserstein distances

H Tran, Y Bai, A Kothapalli, A Shahbazi, X Liu… - arXiv preprint arXiv …, 2024 - arxiv.org
Comparing spherical probability distributions is of great interest in various fields, including
geology, medical domains, computer vision, and deep representation learning. The utility of …

Projected stochastic gradient Langevin algorithms for constrained sampling and non-convex learning

A Lamperski - Conference on Learning Theory, 2021 - proceedings.mlr.press
Langevin algorithms are gradient descent methods with additive noise. They have been
used for decades in Markov Chain Monte Carlo (MCMC) sampling, optimization, and …
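
Note: the "projected" variant keeps the gradient-plus-noise recipe but maps each iterate back onto the constraint set. A minimal sketch using Euclidean projection onto a ball of radius R (the constraint set and names are illustrative; the paper treats general convex constraints and stochastic gradients):

import numpy as np

def projected_langevin(grad_f, x0, radius=1.0, eta=1e-3, n_steps=10_000, rng=None):
    # Langevin step (gradient descent + additive Gaussian noise),
    # followed by Euclidean projection onto {x : ||x|| <= radius}.
    rng = np.random.default_rng() if rng is None else rng
    x = np.asarray(x0, dtype=float)
    for _ in range(n_steps):
        x = x - eta * grad_f(x) + np.sqrt(2 * eta) * rng.standard_normal(x.shape)
        nrm = np.linalg.norm(x)
        if nrm > radius:
            x *= radius / nrm  # project back onto the ball
    return x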

Sampling in constrained domains with orthogonal-space variational gradient descent

R Zhang, Q Liu, X Tong - Advances in Neural Information …, 2022 - proceedings.neurips.cc
Sampling methods, as important inference and learning techniques, are typically designed
for unconstrained domains. However, constraints are ubiquitous in machine learning …

Efficient sampling on Riemannian manifolds via Langevin MCMC

X Cheng, J Zhang, S Sra - Advances in Neural Information …, 2022 - proceedings.neurips.cc
We study the task of efficiently sampling from a Gibbs distribution $d\pi^* = e^{-h}\,d\mathrm{vol}_g$
over a Riemannian manifold $M$ via (geometric) Langevin MCMC; this …
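
Note: on a concrete manifold such as the unit sphere, one geometric Langevin step amounts to projecting the ambient gradient and the noise onto the tangent space, then following the exponential map. A minimal sphere-only sketch (the exact noise discretization in the paper is more careful; this is a simplified illustration):

import numpy as np

def sphere_langevin_step(x, grad_h, eta, rng):
    # One Langevin step targeting pi* = e^{-h} d vol_g on the unit sphere S^{d-1}.
    g = grad_h(x)
    xi = rng.standard_normal(x.shape)
    g_tan = g - np.dot(g, x) * x     # Riemannian gradient: remove radial component
    xi_tan = xi - np.dot(xi, x) * x  # tangential Gaussian noise
    v = -eta * g_tan + np.sqrt(2 * eta) * xi_tan
    # Exponential map on the sphere: exp_x(v) = cos(|v|) x + sin(|v|) v / |v|.
    nv = np.linalg.norm(v)
    return x if nv == 0 else np.cos(nv) * x + np.sin(nv) * (v / nv)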

Learning rate free sampling in constrained domains

L Sharrock, L Mackey… - Advances in Neural …, 2023 - proceedings.neurips.cc
We introduce a suite of new particle-based algorithms for sampling in constrained domains
which are entirely learning rate free. Our approach leverages coin betting ideas from convex …
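
Note: to see why coin betting removes the learning rate, here is the classic Krichevsky-Trofimov bettor applied to 1-D optimization with gradients clipped to [-1, 1]; the paper lifts this idea to interacting particles in constrained domains, which this sketch does not attempt:

import numpy as np

def coin_betting_minimize(grad_f, n_steps=1000, x0=0.0, wealth0=1.0):
    # Krichevsky-Trofimov coin betting: no step size appears anywhere.
    wealth, coin_sum, x = wealth0, 0.0, x0
    for t in range(1, n_steps + 1):
        x = x0 + (coin_sum / t) * wealth      # bet a KT fraction of current wealth
        c = np.clip(-grad_f(x), -1.0, 1.0)    # "coin outcome" = clipped negative gradient
        wealth += c * (x - x0)                # wealth won or lost on this bet
        coin_sum += c
    return x

# Example: minimize f(x) = (x - 3)^2 / 2 with no learning rate to tune.
print(coin_betting_minimize(lambda x: x - 3.0))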