Scaling up GANs for text-to-image synthesis
The recent success of text-to-image synthesis has taken the world by storm and captured the
general public's imagination. From a technical standpoint, it also marked a drastic change in …
Extracting training data from diffusion models
Image diffusion models such as DALL-E 2, Imagen, and Stable Diffusion have attracted
significant attention due to their ability to generate high-quality synthetic images. In this work …
Multi-concept customization of text-to-image diffusion
While generative models produce high-quality images of concepts learned from a large-
scale database, a user often wishes to synthesize instantiations of their own concepts (for …
Patch diffusion: Faster and more data-efficient training of diffusion models
Diffusion models are powerful, but they require a lot of time and data to train. We propose
Patch Diffusion, a generic patch-wise training framework, to significantly reduce the training …
StyleGAN-T: Unlocking the power of GANs for fast large-scale text-to-image synthesis
Text-to-image synthesis has recently seen significant progress thanks to large pretrained
language models, large-scale training data, and the introduction of scalable model families …
Ablating concepts in text-to-image diffusion models
Large-scale text-to-image diffusion models can generate high-fidelity images with powerful
compositional ability. However, these models are typically trained on an enormous amount …
Flow straight and fast: Learning to generate and transfer data with rectified flow
We present rectified flow, a surprisingly simple approach to learning (neural) ordinary
differential equation (ODE) models to transport between two empirically observed …
Dataset distillation by matching training trajectories
Dataset distillation is the task of synthesizing a small dataset such that a model trained on
the synthetic set will match the test accuracy of the model trained on the full dataset. In this …