SiT: Exploring flow and diffusion-based generative models with scalable interpolant transformers

N Ma, M Goldstein, MS Albergo, NM Boffi… - … on Computer Vision, 2024 - Springer
Abstract We present Scalable Interpolant Transformers (SiT), a family of generative models
built on the backbone of Diffusion Transformers (DiT). The interpolant framework, which …

The probability flow ODE is provably fast

S Chen, S Chewi, H Lee, Y Li, J Lu… - Advances in Neural …, 2024 - proceedings.neurips.cc
We provide the first polynomial-time convergence guarantees for the probability flow ODE
implementation (together with a corrector step) of score-based generative modeling. Our …
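
Background sketch (assuming the standard score-SDE setup, not drawn from this snippet): for a forward diffusion dX_t = f(X_t, t) dt + g(t) dW_t with marginals p_t, the probability flow ODE is the deterministic process dx/dt = f(x, t) - (1/2) g(t)^2 ∇_x log p_t(x), which shares the same marginals p_t; the cited work bounds how fast a discretized, score-estimated implementation of this ODE converges.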

Linear convergence bounds for diffusion models via stochastic localization

J Benton, V De Bortoli, A Doucet… - arXiv preprint arXiv …, 2023 - arxiv.org
Diffusion models are a powerful method for generating approximate samples from high-
dimensional data distributions. Several recent results have provided polynomial bounds on …

Learning mixtures of Gaussians using the DDPM objective

K Shah, S Chen, A Klivans - Advances in Neural …, 2023 - proceedings.neurips.cc
Recent works have shown that diffusion models can learn essentially any distribution
provided one can perform score estimation. Yet it remains poorly understood under what …

Towards faster non-asymptotic convergence for diffusion-based generative models

G Li, Y Wei, Y Chen, Y Chi - arXiv preprint arXiv:2306.09251, 2023 - arxiv.org
Diffusion models, which convert noise into new data instances by learning to reverse a
Markov diffusion process, have become a cornerstone in contemporary generative …

A sharp convergence theory for the probability flow ODEs of diffusion models

G Li, Y Wei, Y Chi, Y Chen - arXiv preprint arXiv:2408.02320, 2024 - arxiv.org
Diffusion models, which convert noise into new data instances by learning to reverse a
diffusion process, have become a cornerstone in contemporary generative modeling. In this …

Unraveling the smoothness properties of diffusion models: A Gaussian mixture perspective

Y Liang, Z Shi, Z Song, Y Zhou - arXiv preprint arXiv:2405.16418, 2024 - arxiv.org
Diffusion models have made rapid progress in generating high-quality samples across
various domains. However, a theoretical understanding of the Lipschitz continuity and …

Accelerating convergence of score-based diffusion models, provably

G Li, Y Huang, T Efimov, Y Wei, Y Chi… - arXiv preprint arXiv …, 2024 - arxiv.org
Score-based diffusion models, while achieving remarkable empirical performance, often
suffer from low sampling speed, due to extensive function evaluations needed during the …

Stochastic Runge-Kutta methods: Provable acceleration of diffusion models

Y Wu, Y Chen, Y Wei - arXiv preprint arXiv:2410.04760, 2024 - arxiv.org
Diffusion models play a pivotal role in contemporary generative modeling, claiming state-of-
the-art performance across various domains. Despite their superior sample quality …

Contractive diffusion probabilistic models

W Tang, H Zhao - arXiv preprint arXiv:2401.13115, 2024 - arxiv.org
Diffusion probabilistic models (DPMs) have emerged as a promising technology in
generative modeling. The success of DPMs relies on two ingredients: time reversal of …