SiT: Exploring flow and diffusion-based generative models with scalable interpolant transformers
We present Scalable Interpolant Transformers (SiT), a family of generative models
built on the backbone of Diffusion Transformers (DiT). The interpolant framework, which …
Sampling is as easy as learning the score: theory for diffusion models with minimal data assumptions
We provide theoretical convergence guarantees for score-based generative models (SGMs)
such as denoising diffusion probabilistic models (DDPMs), which constitute the backbone of …
Stochastic interpolants: A unifying framework for flows and diffusions
A class of generative models that unifies flow-based and diffusion-based methods is
introduced. These models extend the framework proposed in Albergo & Vanden-Eijnden …
The probability flow ODE is provably fast
We provide the first polynomial-time convergence guarantees for the probability flow ODE
implementation (together with a corrector step) of score-based generative modeling. Our …
Restoration-degradation beyond linear diffusions: A non-asymptotic analysis for DDIM-type samplers
We develop a framework for non-asymptotic analysis of deterministic samplers used for
diffusion generative modeling. Several recent works have analyzed stochastic samplers …
Linear convergence bounds for diffusion models via stochastic localization
Diffusion models are a powerful method for generating approximate samples from high-
dimensional data distributions. Several recent results have provided polynomial bounds on …
Learning mixtures of Gaussians using the DDPM objective
Recent works have shown that diffusion models can learn essentially any distribution
provided one can perform score estimation. Yet it remains poorly understood under what …
White-box transformers via sparse rate reduction
In this paper, we contend that the objective of representation learning is to compress and
transform the distribution of the data, say sets of tokens, towards a mixture of low …
On the generalization properties of diffusion models
Diffusion models are a class of generative models that serve to establish a stochastic
transport map between an empirically observed, yet unknown, target distribution and a …
Towards faster non-asymptotic convergence for diffusion-based generative models
Diffusion models, which convert noise into new data instances by learning to reverse a
Markov diffusion process, have become a cornerstone in contemporary generative …