The probability flow ODE is provably fast
We provide the first polynomial-time convergence guarantees for the probability flow ODE
implementation (together with a corrector step) of score-based generative modeling. Our …
Sampling is as easy as learning the score: theory for diffusion models with minimal data assumptions
We provide theoretical convergence guarantees for score-based generative models (SGMs)
such as denoising diffusion probabilistic models (DDPMs), which constitute the backbone of …
Linear convergence bounds for diffusion models via stochastic localization
Diffusion models are a powerful method for generating approximate samples from high-
dimensional data distributions. Several recent results have provided polynomial bounds on …
Rapid convergence of the unadjusted Langevin algorithm: Isoperimetry suffices
Abstract We study the Unadjusted Langevin Algorithm (ULA) for sampling from a probability
distribution $\nu = e^{-f}$ on $\mathbb{R}^n$. We prove a convergence guarantee in Kullback …
Forward-backward Gaussian variational inference via JKO in the Bures-Wasserstein space
Variational inference (VI) seeks to approximate a target distribution $\pi $ by an element of a
tractable family of distributions. Of key interest in statistics and machine learning is Gaussian …
Faster high-accuracy log-concave sampling via algorithmic warm starts
It is a fundamental problem to understand the complexity of high-accuracy sampling from a
strongly log-concave density $\pi$ on $\mathbb{R}^d$. Indeed, in practice, high-accuracy samplers such as …
Deep networks as denoising algorithms: Sample-efficient learning of diffusion models in high-dimensional graphical models
We investigate the efficiency of deep neural networks for approximating scoring functions in
diffusion-based generative modeling. While existing approximation theories leverage the …
Statistical optimal transport
Statistical Optimal Transport, arXiv:2407.18163v2 [math.ST], 7 Nov 2024. Sinho Chewi (Yale), Jonathan Niles-Weed (NYU), Philippe Rigollet (MIT) …
Bourgain's slicing problem and KLS isoperimetry up to polylog
Published in Geometric and Functional Analysis.
Improved discretization analysis for underdamped Langevin Monte Carlo
Abstract Underdamped Langevin Monte Carlo (ULMC) is an algorithm used to sample from
unnormalized densities by leveraging the momentum of a particle moving in a potential well …