Diffusion models: A comprehensive survey of methods and applications

L Yang, Z Zhang, Y Song, S Hong, R Xu, Y Zhao… - ACM Computing …, 2023 - dl.acm.org
Diffusion models have emerged as a powerful new family of deep generative models with
record-breaking performance in many applications, including image synthesis, video …

Deep generative modelling: A comparative review of VAEs, GANs, normalizing flows, energy-based and autoregressive models

S Bond-Taylor, A Leach, Y Long… - IEEE transactions on …, 2021 - ieeexplore.ieee.org
Deep generative models are a class of techniques that train deep neural networks to model
the distribution of training samples. Research has fragmented into various interconnected …

Compositional visual generation with composable diffusion models

N Liu, S Li, Y Du, A Torralba, JB Tenenbaum - European Conference on …, 2022 - Springer
Large text-guided diffusion models, such as DALL-E 2, are able to generate stunning
photorealistic images given natural language descriptions. While such models are highly …
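
The compositional idea in this line of work is to treat each conditional diffusion model as supplying a score (or noise) estimate and to combine the estimates, e.g. adding the differences between conditional and unconditional predictions for a conjunction of concepts. A minimal sketch of that combination step, with toy Gaussian scores standing in for the actual text-conditioned models (the toy functions and weights are illustrative assumptions):

```python
import numpy as np

def combined_score(x, uncond_score, cond_scores, weights):
    """Conjunction-style composition: s(x) = s_uncond(x) + sum_i w_i * (s_i(x) - s_uncond(x))."""
    s = uncond_score(x)
    return s + sum(w * (c(x) - s) for w, c in zip(weights, cond_scores))

# Toy scores standing in for an "unconditional" model and two "concept-conditioned" models.
uncond = lambda x: -x
cond_a = lambda x: np.array([2.0, 0.0]) - x
cond_b = lambda x: np.array([0.0, 2.0]) - x

x = np.zeros(2)
print(combined_score(x, uncond, [cond_a, cond_b], weights=[1.0, 1.0]))
```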

Reduce, reuse, recycle: Compositional generation with energy-based diffusion models and MCMC

Y Du, C Durkan, R Strudel… - International …, 2023 - proceedings.mlr.press
Since their introduction, diffusion models have quickly become the prevailing approach to
generative modeling in many domains. They can be interpreted as learning the gradients of …
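
Read as energy-based models, the learned scores are gradients of (negative) energies, so several models can be combined by summing their energies and sampling the result with an MCMC method rather than the standard sampler. A minimal sketch with toy quadratic energies and a Metropolis-adjusted Langevin sampler (all specifics here are illustrative assumptions, not the paper's exact samplers or schedules):

```python
import numpy as np

rng = np.random.default_rng(0)

# Two toy energies standing in for models being composed; their sum defines the product distribution.
E1 = lambda x: 0.5 * np.sum((x - 1.0) ** 2)
E2 = lambda x: 0.5 * np.sum((x + 1.0) ** 2)
E = lambda x: E1(x) + E2(x)
grad_E = lambda x: (x - 1.0) + (x + 1.0)

def mala_step(x, eps=0.1):
    """One Metropolis-adjusted Langevin step targeting exp(-E(x))."""
    prop = x - eps * grad_E(x) + np.sqrt(2 * eps) * rng.standard_normal(x.shape)
    log_q_fwd = -np.sum((prop - (x - eps * grad_E(x))) ** 2) / (4 * eps)    # log q(prop | x)
    log_q_bwd = -np.sum((x - (prop - eps * grad_E(prop))) ** 2) / (4 * eps) # log q(x | prop)
    log_accept = (E(x) - E(prop)) + (log_q_bwd - log_q_fwd)
    return prop if np.log(rng.uniform()) < log_accept else x

x = rng.standard_normal(2)
for _ in range(2000):
    x = mala_step(x)
print(x)  # concentrates near 0, the mode of the composed (product) distribution
```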

Denoising diffusion probabilistic models

J Ho, A Jain, P Abbeel - Advances in neural information …, 2020 - proceedings.neurips.cc
We present high quality image synthesis results using diffusion probabilistic models, a class
of latent variable models inspired by considerations from nonequilibrium thermodynamics …
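
The simplified training objective popularized by this paper reduces to predicting the noise injected by a closed-form forward process, x_t = sqrt(alpha_bar_t) x_0 + sqrt(1 - alpha_bar_t) eps, with loss ||eps - eps_theta(x_t, t)||^2. A minimal sketch of one training step, with a toy MLP standing in for the denoising network (the architecture, data, and schedule values are illustrative assumptions; the paper uses a U-Net on images):

```python
import torch

T = 1000
betas = torch.linspace(1e-4, 0.02, T)          # linear noise schedule
alpha_bar = torch.cumprod(1.0 - betas, dim=0)  # cumulative product \bar{alpha}_t

# Toy noise-prediction network eps_theta(x_t, t) for 2-D data.
eps_theta = torch.nn.Sequential(torch.nn.Linear(3, 128), torch.nn.ReLU(), torch.nn.Linear(128, 2))
opt = torch.optim.Adam(eps_theta.parameters(), lr=1e-3)

x0 = torch.randn(256, 2)                       # stand-in for a batch of data
t = torch.randint(0, T, (256,))
eps = torch.randn_like(x0)
xt = alpha_bar[t].sqrt().unsqueeze(-1) * x0 + (1 - alpha_bar[t]).sqrt().unsqueeze(-1) * eps

pred = eps_theta(torch.cat([xt, t.float().unsqueeze(-1) / T], dim=-1))
loss = ((eps - pred) ** 2).mean()              # simplified DDPM noise-prediction objective
loss.backward(); opt.step()
```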

On aliased resizing and surprising subtleties in gan evaluation

G Parmar, R Zhang, JY Zhu - Proceedings of the IEEE/CVF …, 2022 - openaccess.thecvf.com
Metrics for evaluating generative models aim to measure the discrepancy between real and
generated images. The often-used Fréchet Inception Distance (FID) metric, for example …
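
For reference, FID compares the mean and covariance of Inception features of real and generated images; the paper's point is that preprocessing choices such as resizing can shift these statistics. A minimal sketch of the FID formula itself, assuming the feature vectors have already been extracted (the random features below are placeholders, not Inception activations):

```python
import numpy as np
from scipy.linalg import sqrtm

def fid(feats_real, feats_gen):
    """Frechet Inception Distance between two sets of feature vectors:
    ||mu_r - mu_g||^2 + Tr(S_r + S_g - 2 (S_r S_g)^{1/2})."""
    mu_r, mu_g = feats_real.mean(0), feats_gen.mean(0)
    cov_r = np.cov(feats_real, rowvar=False)
    cov_g = np.cov(feats_gen, rowvar=False)
    covmean = sqrtm(cov_r @ cov_g)
    if np.iscomplexobj(covmean):  # numerical error can leave tiny imaginary parts
        covmean = covmean.real
    return float(np.sum((mu_r - mu_g) ** 2) + np.trace(cov_r + cov_g - 2 * covmean))

rng = np.random.default_rng(0)
print(fid(rng.standard_normal((512, 64)), rng.standard_normal((512, 64))))
```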

Generative modeling by estimating gradients of the data distribution

Y Song, S Ermon - Advances in neural information …, 2019 - proceedings.neurips.cc
We introduce a new generative model where samples are produced via Langevin dynamics
using gradients of the data distribution estimated with score matching. Because gradients …
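
The sampling procedure referred to here is Langevin dynamics driven by an estimated score function. A minimal sketch, assuming a toy analytic score for a 2-D Gaussian in place of a learned score network (the paper additionally anneals over noise levels, which is omitted here):

```python
import numpy as np

def score_gaussian(x, mu=np.array([1.0, -1.0]), sigma=0.5):
    """Toy stand-in for a learned score: grad_x log N(x; mu, sigma^2 I)."""
    return (mu - x) / sigma**2

def langevin_sample(score_fn, n_steps=1000, step_size=1e-3, dim=2, rng=None):
    """Unadjusted Langevin dynamics: x <- x + (eps/2) * score(x) + sqrt(eps) * z."""
    rng = rng or np.random.default_rng(0)
    x = rng.standard_normal(dim)
    for _ in range(n_steps):
        z = rng.standard_normal(dim)
        x = x + 0.5 * step_size * score_fn(x) + np.sqrt(step_size) * z
    return x

print(langevin_sample(score_gaussian))  # lands near the toy mean [1, -1]
```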

COLD decoding: Energy-based constrained text generation with Langevin dynamics

L Qin, S Welleck, D Khashabi… - Advances in Neural …, 2022 - proceedings.neurips.cc
Many applications of text generation require incorporating different constraints to control the
semantics or style of generated text. These constraints can be hard (e.g., ensuring certain …
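
The approach named in the title treats a candidate output as a sequence of soft token logits and samples it with Langevin dynamics on an energy that sums differentiable constraint terms. A minimal sketch of that update, with a single toy keyword constraint in place of the fluency and semantic energies used in the paper (the energy, shapes, and step sizes are illustrative assumptions):

```python
import torch

vocab_size, seq_len = 100, 8
target_word = 42  # toy "keyword" id the output should contain somewhere

def energy(y_logits):
    """Toy energy: low when some position puts high probability on the keyword."""
    probs = torch.softmax(y_logits, dim=-1)     # soft token distribution per position
    keyword_prob = probs[:, target_word].max()  # best position for the keyword
    return -torch.log(keyword_prob + 1e-9)      # a real energy would sum several constraint terms

y = torch.randn(seq_len, vocab_size, requires_grad=True)
eps = 0.1
for _ in range(200):
    (grad,) = torch.autograd.grad(energy(y), y)
    with torch.no_grad():
        y += -eps * grad + (2 * eps) ** 0.5 * 0.01 * torch.randn_like(y)  # Langevin-style step

tokens = y.argmax(dim=-1)  # discretize the soft sequence
print(tokens, target_word in tokens.tolist())
```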

Your classifier is secretly an energy based model and you should treat it like one

W Grathwohl, KC Wang, JH Jacobsen… - arXiv preprint arXiv …, 2019 - arxiv.org
We propose to reinterpret a standard discriminative classifier of p(y|x) as an energy based
model for the joint distribution p(x, y). In this setting, the standard class probabilities can be …
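
The reinterpretation is that the classifier's logits f(x)[y] define an energy for each (x, y) pair, so the same network yields p(y|x) by softmax and an unnormalized p(x) via a log-sum-exp over the logits. A minimal sketch of that bookkeeping (the small MLP is an illustrative assumption, not the architecture used in the paper):

```python
import torch
import torch.nn.functional as F

# Toy classifier f: R^d -> R^K producing one logit per class.
classifier = torch.nn.Sequential(
    torch.nn.Linear(32, 64), torch.nn.ReLU(), torch.nn.Linear(64, 10)
)

x = torch.randn(8, 32)
logits = classifier(x)                            # f(x)[y]

p_y_given_x = F.softmax(logits, dim=-1)           # standard class probabilities p(y|x)
log_p_x_unnorm = torch.logsumexp(logits, dim=-1)  # log p(x) up to a constant
energy_x = -log_p_x_unnorm                        # E(x) = -logsumexp_y f(x)[y]
print(p_y_given_x.shape, energy_x.shape)
```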

Learning gradient fields for shape generation

R Cai, G Yang, H Averbuch-Elor, Z Hao… - Computer Vision–ECCV …, 2020 - Springer
In this work, we propose a novel technique to generate shapes from point cloud data. A point
cloud can be viewed as samples from a distribution of 3D points whose density is …
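
The generation idea described here is to learn a gradient field of the density of 3-D points concentrated near a shape's surface, then move randomly initialized points along that field. A minimal sketch with an analytic gradient field of a unit sphere standing in for the learned, shape-conditioned network (the field, noise scale, and step sizes are illustrative assumptions):

```python
import numpy as np

def sphere_gradient_field(points, radius=1.0):
    """Toy stand-in for a learned gradient field: pushes points toward the sphere surface."""
    norms = np.linalg.norm(points, axis=-1, keepdims=True) + 1e-9
    return (radius - norms) * points / norms  # gradient of -0.5 * (||x|| - r)^2

rng = np.random.default_rng(0)
points = rng.standard_normal((2048, 3))       # random initial point cloud
step = 0.1
for _ in range(200):
    noise = np.sqrt(step) * 0.05 * rng.standard_normal(points.shape)
    points = points + step * sphere_gradient_field(points) + noise  # Langevin-style refinement

print(np.linalg.norm(points, axis=-1).mean())  # close to 1: points settle near the surface
```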