Riemannian diffusion models
Diffusion models are recent state-of-the-art methods for image generation and likelihood
estimation. In this work, we generalize continuous-time diffusion models to arbitrary …
Rapid convergence of the unadjusted Langevin algorithm: Isoperimetry suffices
Abstract We study the Unadjusted Langevin Algorithm (ULA) for sampling from a probability
distribution $\nu = e^{-f}$ on $\mathbb{R}^n$. We prove a convergence guarantee in Kullback …
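The ULA recursion this abstract refers to is $x_{k+1} = x_k - \eta \nabla f(x_k) + \sqrt{2\eta}\,\xi_k$ with $\xi_k \sim \mathcal{N}(0, I)$. A minimal NumPy sketch, using an illustrative standard-Gaussian target (function names, step size, and iteration count are our choices, not the paper's):

```python
import numpy as np

def ula_sample(grad_f, x0, step=0.01, n_steps=5000, rng=None):
    """Unadjusted Langevin Algorithm:
    x_{k+1} = x_k - step * grad_f(x_k) + sqrt(2*step) * xi_k."""
    rng = np.random.default_rng(rng)
    x = np.array(x0, dtype=float)
    samples = np.empty((n_steps, x.size))
    for k in range(n_steps):
        noise = rng.standard_normal(x.size)
        x = x - step * grad_f(x) + np.sqrt(2 * step) * noise
        samples[k] = x
    return samples

# Illustrative target: standard Gaussian, f(x) = ||x||^2 / 2, so grad_f(x) = x.
samples = ula_sample(lambda x: x, x0=np.zeros(2), step=0.05, n_steps=20000, rng=0)
```

Note that ULA is biased: its stationary distribution differs from the target by a step-size-dependent amount, which is why convergence guarantees like the one above are stated for finite step sizes.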
Spherical Sliced-Wasserstein
Many variants of the Wasserstein distance have been introduced to reduce its original
computational burden. In particular the Sliced-Wasserstein distance (SW), which leverages …
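The Sliced-Wasserstein distance mentioned here reduces computation by projecting both distributions onto random one-dimensional directions, where the Wasserstein distance has a closed form via sorted samples. A sketch of the standard (Euclidean) SW estimate between two empirical distributions; the function name and parameters are illustrative:

```python
import numpy as np

def sliced_wasserstein(X, Y, n_projections=100, rng=None):
    """Monte Carlo estimate of the Sliced-Wasserstein-2 distance between
    two empirical distributions with the same number of sample points."""
    rng = np.random.default_rng(rng)
    d = X.shape[1]
    # Draw random directions uniformly on the unit sphere.
    theta = rng.standard_normal((n_projections, d))
    theta /= np.linalg.norm(theta, axis=1, keepdims=True)
    # Project onto each direction; sorting gives the 1D quantile coupling.
    px = np.sort(X @ theta.T, axis=0)
    py = np.sort(Y @ theta.T, axis=0)
    return np.sqrt(np.mean((px - py) ** 2))

X = np.random.default_rng(0).standard_normal((500, 3))
```

The spherical variant in the paper above adapts this slicing construction to distributions supported on the sphere, where straight-line projections no longer apply directly.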
Convergence of kinetic Langevin Monte Carlo on Lie groups
Explicit, momentum-based dynamics for optimizing functions defined on Lie groups was
recently constructed, based on techniques such as variational optimization and left …
Convergence of the Riemannian Langevin algorithm
We study the Riemannian Langevin Algorithm for the problem of sampling from a distribution
with density $\nu$ with respect to the natural measure on a manifold with metric $g$. We …
Stereographic spherical sliced Wasserstein distances
Comparing spherical probability distributions is of great interest in various fields, including
geology, medical domains, computer vision, and deep representation learning. The utility of …
Projected stochastic gradient langevin algorithms for constrained sampling and non-convex learning
A. Lamperski - Conference on Learning Theory, 2021 - proceedings.mlr.press
Langevin algorithms are gradient descent methods with additive noise. They have been
used for decades in Markov Chain Monte Carlo (MCMC) sampling, optimization, and …
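A projected Langevin step is a Langevin step followed by a Euclidean projection back onto the constraint set. A minimal sketch, with an illustrative unit-ball constraint and Gaussian target (the names and the specific projection are our choices, not necessarily the paper's construction):

```python
import numpy as np

def projected_ula(grad_f, project, x0, step=0.01, n_steps=5000, rng=None):
    """Langevin step with additive Gaussian noise, then projection
    back onto the constraint set after each update."""
    rng = np.random.default_rng(rng)
    x = np.array(x0, dtype=float)
    out = np.empty((n_steps, x.size))
    for k in range(n_steps):
        noise = rng.standard_normal(x.size)
        x = x - step * grad_f(x) + np.sqrt(2 * step) * noise
        x = project(x)  # enforce the constraint
        out[k] = x
    return out

# Illustrative constraint: the unit ball; projection is radial rescaling.
proj_ball = lambda x: x / max(1.0, np.linalg.norm(x))
samples = projected_ula(lambda x: x, proj_ball, np.zeros(2),
                        step=0.05, n_steps=10000, rng=1)
```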
Sampling in constrained domains with orthogonal-space variational gradient descent
Sampling methods, as important inference and learning techniques, are typically designed
for unconstrained domains. However, constraints are ubiquitous in machine learning …
Efficient sampling on Riemannian manifolds via Langevin MCMC
We study the task of efficiently sampling from a Gibbs distribution $d\pi^* = e^{-h}\,d\mathrm{vol}_g$
over a Riemannian manifold $M$ via (geometric) Langevin MCMC; this …
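Geometric Langevin schemes of this kind move in the tangent space at the current point and then map back to the manifold. A minimal sketch on the unit sphere, using tangent-space projection and a normalization retraction; this is a common discretization pattern, not necessarily the paper's exact scheme, and the uniform target is illustrative:

```python
import numpy as np

def sphere_langevin(grad_h, x0, step=0.01, n_steps=5000, rng=None):
    """Langevin MCMC on the unit sphere S^{d-1}: project drift and noise
    onto the tangent space at x, step, then retract by normalizing."""
    rng = np.random.default_rng(rng)
    x = np.array(x0, dtype=float)
    x = x / np.linalg.norm(x)
    out = np.empty((n_steps, x.size))
    for k in range(n_steps):
        g = grad_h(x)
        noise = rng.standard_normal(x.size)
        tangent = lambda v: v - x * (x @ v)  # projection onto T_x S^{d-1}
        v = -step * tangent(g) + np.sqrt(2 * step) * tangent(noise)
        y = x + v
        x = y / np.linalg.norm(y)  # retraction back to the sphere
        out[k] = x
    return out

# Illustrative target: uniform on the sphere (h constant, so grad_h = 0).
samples = sphere_langevin(lambda x: np.zeros_like(x),
                          np.array([1.0, 0.0, 0.0]), step=0.1,
                          n_steps=2000, rng=0)
```

More careful schemes replace the normalization retraction with the exact exponential map; on the sphere both stay on the manifold, but their discretization errors differ.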
Learning rate free sampling in constrained domains
We introduce a suite of new particle-based algorithms for sampling in constrained domains
which are entirely learning rate free. Our approach leverages coin betting ideas from convex …