Sampling is as easy as learning the score: theory for diffusion models with minimal data assumptions
We provide theoretical convergence guarantees for score-based generative models (SGMs)
such as denoising diffusion probabilistic models (DDPMs), which constitute the backbone of …
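As background for this entry, a minimal sketch of the dynamics such convergence guarantees analyze, assuming an Ornstein–Uhlenbeck forward noising process (this particular scaling is an illustrative convention, not taken from the entry):
\[
\mathrm{d}X_t = -X_t\,\mathrm{d}t + \sqrt{2}\,\mathrm{d}B_t, \qquad X_0 \sim p_{\mathrm{data}},
\]
\[
\mathrm{d}\bar{X}_t = \bigl(\bar{X}_t + 2\,\nabla \log p_{T-t}(\bar{X}_t)\bigr)\,\mathrm{d}t + \sqrt{2}\,\mathrm{d}B_t, \qquad \bar{X}_0 \sim p_T,
\]
where $p_t$ denotes the law of $X_t$. A DDPM-type sampler discretizes the reverse-time SDE with a learned score $s_\theta \approx \nabla \log p_t$, which is why the guarantees are phrased in terms of score-estimation error.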
The probability flow ODE is provably fast
We provide the first polynomial-time convergence guarantees for the probability flow ODE
implementation (together with a corrector step) of score-based generative modeling. Our …
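For reference, the corresponding probability flow ODE under the same Ornstein–Uhlenbeck forward convention as above (an assumed parameterization; the paper's exact setup may differ):
\[
\frac{\mathrm{d}}{\mathrm{d}t}\,x_t = x_t + \nabla \log p_{T-t}(x_t),
\]
a deterministic flow sharing the same marginals as the reverse SDE; the corrector step mentioned in the snippet interleaves Langevin-type updates to keep the iterates close to $p_{T-t}$.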
Rapid convergence of the unadjusted Langevin algorithm: Isoperimetry suffices
S Vempala, A Wibisono - Advances in neural information …, 2019 - proceedings.neurips.cc
We study the Unadjusted Langevin Algorithm (ULA) for sampling from a probability
distribution $\nu = e^{-f}$ on $\mathbb{R}^n$. We prove a convergence guarantee in Kullback …
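A minimal ULA sketch in Python, assuming access to the gradient of the potential $f$; the quadratic example at the end is purely illustrative.

```python
import numpy as np

def ula(grad_f, x0, step, n_iters, rng=None):
    """Unadjusted Langevin Algorithm: x <- x - step * grad_f(x) + sqrt(2*step) * noise."""
    rng = np.random.default_rng() if rng is None else rng
    x = np.asarray(x0, dtype=float)
    samples = []
    for _ in range(n_iters):
        x = x - step * grad_f(x) + np.sqrt(2.0 * step) * rng.standard_normal(x.shape)
        samples.append(x.copy())
    return np.array(samples)

# Illustrative target: nu proportional to exp(-||x||^2 / 2), i.e. a standard Gaussian.
samples = ula(grad_f=lambda x: x, x0=np.zeros(2), step=0.01, n_iters=5000)
```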
Faster high-accuracy log-concave sampling via algorithmic warm starts
JM Altschuler, S Chewi - Journal of the ACM, 2024 - dl.acm.org
It is a fundamental problem to understand the complexity of high-accuracy sampling from a
strongly log-concave density π on ℝ^d. Indeed, in practice, high-accuracy samplers such as …
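The snippet is truncated; a standard example of a high-accuracy sampler is the Metropolis-adjusted Langevin algorithm (MALA), sketched below purely as illustration (the function names and the final usage line are assumptions, not taken from the paper):

```python
import numpy as np

def mala(f, grad_f, x0, step, n_iters, rng=None):
    """Metropolis-adjusted Langevin: ULA proposal plus accept/reject against exp(-f)."""
    rng = np.random.default_rng() if rng is None else rng
    x = np.asarray(x0, dtype=float)
    samples = []
    for _ in range(n_iters):
        mean_x = x - step * grad_f(x)
        y = mean_x + np.sqrt(2.0 * step) * rng.standard_normal(x.shape)
        mean_y = y - step * grad_f(y)
        # Log proposal densities: forward q(y|x) and reverse q(x|y), up to a common constant.
        log_q_fwd = -np.sum((y - mean_x) ** 2) / (4.0 * step)
        log_q_bwd = -np.sum((x - mean_y) ** 2) / (4.0 * step)
        log_alpha = -f(y) + f(x) + log_q_bwd - log_q_fwd
        if np.log(rng.uniform()) < log_alpha:
            x = y
        samples.append(x.copy())
    return np.array(samples)

# Illustrative standard-Gaussian target.
samples = mala(f=lambda x: 0.5 * np.sum(x ** 2), grad_f=lambda x: x,
               x0=np.zeros(2), step=0.1, n_iters=2000)
```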
Forward-backward Gaussian variational inference via JKO in the Bures-Wasserstein space
MZ Diao, K Balasubramanian… - … on Machine Learning, 2023 - proceedings.mlr.press
Variational inference (VI) seeks to approximate a target distribution $\pi$ by an element of a
tractable family of distributions. Of key interest in statistics and machine learning is Gaussian …
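For orientation, the Gaussian VI problem that Bures–Wasserstein methods address, written in a standard form (the notation here is assumed, not quoted from the entry):
\[
\min_{m \in \mathbb{R}^d,\ \Sigma \succ 0}\ \mathrm{KL}\bigl(\mathcal{N}(m, \Sigma)\,\|\,\pi\bigr),
\]
viewed as optimization over the space of Gaussians equipped with the 2-Wasserstein metric; a forward–backward (JKO-type) scheme alternates a gradient step on the potential term with a proximal step on the entropy term.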
Improved discretization analysis for underdamped Langevin Monte Carlo
Underdamped Langevin Monte Carlo (ULMC) is an algorithm used to sample from
unnormalized densities by leveraging the momentum of a particle moving in a potential well …
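For reference, the continuous-time dynamics that ULMC discretizes, in one common parameterization (the friction constant $\gamma$ and unit mass are assumed conventions):
\[
\mathrm{d}X_t = V_t\,\mathrm{d}t, \qquad
\mathrm{d}V_t = -\gamma V_t\,\mathrm{d}t - \nabla f(X_t)\,\mathrm{d}t + \sqrt{2\gamma}\,\mathrm{d}B_t,
\]
whose stationary distribution is proportional to $e^{-f(x) - \|v\|^2/2}$; the discretization analysis asks how coarse a step size still yields accurate samples in the $x$-marginal.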
Statistical optimal transport
S Chewi, J Niles-Weed, P Rigollet - arXiv:2407.18163 [math.ST], 2024
Deep networks as denoising algorithms: Sample-efficient learning of diffusion models in high-dimensional graphical models
We investigate the efficiency of deep neural networks for approximating score functions in
diffusion-based generative modeling. While existing approximation theories leverage the …
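As context, the denoising score matching objective usually used to train such score networks (a standard formulation, assumed rather than quoted from the entry):
\[
\mathcal{L}(\theta) = \mathbb{E}_{t}\,\mathbb{E}_{x_0 \sim p_{\mathrm{data}}}\,\mathbb{E}_{x_t \sim p_t(\cdot \mid x_0)}
\bigl\| s_\theta(x_t, t) - \nabla_{x_t} \log p_t(x_t \mid x_0) \bigr\|^2,
\]
whose minimizer is the true score $\nabla \log p_t$; approximation theory for diffusion models asks how large the network $s_\theta$ must be for this error to be small.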
Bourgain's slicing problem and KLS isoperimetry up to polylog
B Klartag, J Lehec - Geometric and functional analysis, 2022 - Springer
Improved dimension dependence of a proximal algorithm for sampling
We propose a sampling algorithm that achieves superior complexity bounds in all the
classical settings (strongly log-concave, log-concave, Logarithmic-Sobolev inequality (LSI) …
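For reference, the proximal sampler scheme typically meant by "proximal algorithm for sampling", stated as background (the step-size symbol $\eta$ is an assumed notation): given the current iterate $x_k$, alternate
\[
y_k \sim \mathcal{N}(x_k, \eta I), \qquad
x_{k+1} \sim \pi^{X \mid Y = y_k}, \quad \text{where } \pi^{X \mid Y = y}(x) \propto \pi(x)\, e^{-\|x - y\|^2 / (2\eta)},
\]
i.e. Gibbs sampling on the joint density $\pi(x)\, e^{-\|x - y\|^2/(2\eta)}$. The second step is the restricted Gaussian oracle, and the complexity question is how the number of such steps scales with the dimension.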