Normalizing flows for probabilistic modeling and inference

G Papamakarios, E Nalisnick, DJ Rezende… - Journal of Machine Learning Research, 2021 - jmlr.org
Normalizing flows provide a general mechanism for defining expressive probability
distributions, only requiring the specification of a (usually simple) base distribution and a …
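The mechanism referred to here is the change-of-variables formula; a minimal generic statement (symbols are standard flow notation, not necessarily the paper's) is:

```latex
% Density of x = T(u) for an invertible, differentiable transform T with base
% density p_u. Generic change-of-variables statement, not the paper's notation.
p_x(x) = p_u\big(T^{-1}(x)\big)\,\bigl|\det J_{T^{-1}}(x)\bigr|
       = p_u(u)\,\bigl|\det J_T(u)\bigr|^{-1}, \qquad u = T^{-1}(x).
```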

Deep generative modelling: A comparative review of VAEs, GANs, normalizing flows, energy-based and autoregressive models

S Bond-Taylor, A Leach, Y Long… - IEEE Transactions on Pattern Analysis and Machine Intelligence, 2021 - ieeexplore.ieee.org
Deep generative models are a class of techniques that train deep neural networks to model
the distribution of training samples. Research has fragmented into various interconnected …

Normalizing flows: An introduction and review of current methods

I Kobyzev, SJD Prince… - IEEE Transactions on Pattern Analysis and Machine Intelligence, 2020 - ieeexplore.ieee.org
Normalizing Flows are generative models which produce tractable distributions where both
sampling and density evaluation can be efficient and exact. The goal of this survey article is …
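To illustrate the "efficient and exact" claim, here is a minimal sketch of a single affine flow layer over a standard-normal base in NumPy; the parameters and function names are invented for illustration and are not taken from the survey.

```python
import numpy as np

# A single invertible affine layer, x = exp(log_scale) * u + shift, over a
# standard-normal base distribution. Both directions are closed form, so
# sampling and exact log-density each cost one pass. Illustrative sketch only.
log_scale, shift = np.array([0.5, -0.3]), np.array([1.0, 2.0])

def sample(n, rng=np.random.default_rng(0)):
    u = rng.standard_normal((n, 2))              # draw from the base distribution
    return u * np.exp(log_scale) + shift         # push forward through the flow

def log_prob(x):
    u = (x - shift) * np.exp(-log_scale)         # invert the transform exactly
    base = -0.5 * np.sum(u**2 + np.log(2 * np.pi), axis=1)   # N(0, I) log-density
    return base - np.sum(log_scale)              # subtract log|det Jacobian|

x = sample(4)
print(log_prob(x))                               # exact log-densities of the samples
```

Sampling is one forward pass through the transform, and the exact log-density is one inverse pass plus the log-determinant of the Jacobian, which for this diagonal affine layer is just the sum of the log-scales.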

Maximum likelihood training of score-based diffusion models

Y Song, C Durkan, I Murray… - Advances in Neural Information Processing Systems, 2021 - proceedings.neurips.cc
Score-based diffusion models synthesize samples by reversing a stochastic process that
diffuses data to noise, and are trained by minimizing a weighted combination of score …
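A hedged sketch of the kind of weighted denoising score-matching loss alluded to here, using a generic variance-preserving perturbation, follows; the tiny network, the cosine schedule, and the weighting lambda(t) = sigma_t^2 are illustrative assumptions, not the paper's maximum-likelihood weighting.

```python
import torch, torch.nn as nn

# Sketch of a weighted denoising score-matching loss for a variance-preserving
# perturbation x_t = alpha_t * x_0 + sigma_t * eps. The MLP, the schedule, and
# lambda(t) = sigma_t**2 are illustrative choices only.
score_net = nn.Sequential(nn.Linear(2 + 1, 64), nn.SiLU(), nn.Linear(64, 2))

def dsm_loss(x0):
    t = torch.rand(x0.shape[0], 1) * 0.998 + 1e-3     # avoid t=0 where sigma vanishes
    alpha, sigma = torch.cos(t * torch.pi / 2), torch.sin(t * torch.pi / 2)
    eps = torch.randn_like(x0)
    xt = alpha * x0 + sigma * eps                     # diffuse data toward noise
    score = score_net(torch.cat([xt, t], dim=1))      # s_theta(x_t, t)
    target = -eps / sigma                             # score of q(x_t | x_0)
    weight = sigma ** 2                               # lambda(t) weighting
    return (weight * (score - target) ** 2).sum(dim=1).mean()

loss = dsm_loss(torch.randn(128, 2))                  # stand-in 2-D training batch
loss.backward()                                       # gradients for the score network
```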

Analog bits: Generating discrete data using diffusion models with self-conditioning

T Chen, R Zhang, G Hinton - arXiv preprint arXiv:2208.04202, 2022 - arxiv.org
We present Bit Diffusion: a simple and generic approach for generating discrete data with
continuous state and continuous time diffusion models. The main idea behind our approach …
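The preprocessing idea can be sketched as follows: represent each integer by its binary expansion and cast the bits to real values a continuous diffusion model can handle. The function names and the plus/minus-one scaling below are illustrative choices, not necessarily the paper's exact ones.

```python
import numpy as np

# Sketch of the "analog bits" idea: map integers to their bits, treat the bits
# as real values in {-1., +1.} for a continuous diffusion model, then threshold
# to recover discrete data. Names and scaling are illustrative choices.
def int_to_analog_bits(x, num_bits):
    bits = (x[..., None] >> np.arange(num_bits)) & 1     # little-endian bits
    return bits.astype(np.float32) * 2.0 - 1.0            # {0,1} -> {-1,+1}

def analog_bits_to_int(a):
    bits = (a > 0).astype(np.int64)                        # threshold back to bits
    return (bits << np.arange(a.shape[-1])).sum(axis=-1)

tokens = np.array([0, 3, 5, 255])
analog = int_to_analog_bits(tokens, num_bits=8)            # continuous inputs for diffusion
print(analog_bits_to_int(analog))                          # -> [  0   3   5 255]
```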

Argmax flows and multinomial diffusion: Learning categorical distributions

E Hoogeboom, D Nielsen, P Jaini… - Advances in Neural Information Processing Systems, 2021 - proceedings.neurips.cc
Generative flows and diffusion models have been predominantly trained on ordinal data, for
example natural images. This paper introduces two extensions of flows and diffusion for …
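For the multinomial-diffusion extension, a hedged NumPy sketch of a forward corruption step on categorical data (with an invented noise schedule and category count) is:

```python
import numpy as np

# Sketch of a multinomial-diffusion-style forward step on categorical data:
# with probability beta_t a symbol is resampled uniformly over K categories,
# otherwise it is kept. K, beta_t, and the loop are illustrative.
rng = np.random.default_rng(0)

def forward_step(x, beta_t, K):
    resample = rng.random(x.shape) < beta_t            # which symbols to corrupt
    uniform = rng.integers(0, K, size=x.shape)         # uniform replacement symbols
    return np.where(resample, uniform, x)

x0 = rng.integers(0, 27, size=16)                      # e.g. 27 character classes
xt = x0
for beta_t in np.linspace(0.02, 0.3, 10):              # invented noise schedule
    xt = forward_step(xt, beta_t, K=27)
print(x0)
print(xt)                                              # progressively noisier categories
```

Under this kernel a symbol survives a step with probability (1 - beta_t) + beta_t/K, the usual categorical corruption used in this line of work.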

Language modeling is compression

G Delétang, A Ruoss, PA Duquenne, E Catt… - arXiv preprint, 2023 - arxiv.org
It has long been established that predictive models can be transformed into lossless
compressors and vice versa. Incidentally, in recent years, the machine learning community …
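The "predictive model to lossless compressor" direction rests on arithmetic coding: a sequence costs roughly -log2 of the probability the model assigns to it. A toy sketch of that code-length calculation, with a made-up four-symbol model, is:

```python
import math

# Sketch of the predictive-model-as-compressor view: an arithmetic coder driven
# by a model's conditional probabilities needs about -log2 p(sequence) bits.
# The toy four-symbol "model" below is invented purely for illustration.
def toy_model_prob(symbol, context):
    if not context:
        return 0.25                                  # uniform over 4 symbols at the start
    return 0.7 if symbol == context[-1] else 0.1     # strong bias toward repeats

def code_length_bits(sequence):
    total, context = 0.0, []
    for s in sequence:
        total += -math.log2(toy_model_prob(s, context))
        context.append(s)
    return total

print(code_length_bits(["a", "a", "a", "b", "b", "a"]))  # ~10.2 bits vs 12 for a flat 2-bit code
```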

An introduction to neural data compression

Y Yang, S Mandt, L Theis - Foundations and Trends® in Computer Graphics and Vision, 2023 - nowpublishers.com
Neural compression is the application of neural networks and other machine learning
methods to data compression. Recent advances in statistical machine learning have opened …
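A common organizing objective in this literature is the rate-distortion Lagrangian, which most learned codecs instantiate in some form; stated generically (symbols are standard, not tied to this monograph):

```latex
% Generic rate-distortion training objective for a learned codec with encoder f,
% decoder g, quantizer Q, entropy model p, distortion d, and multiplier lambda.
\mathcal{L} = \mathbb{E}_{x \sim p_{\text{data}}}\Big[
  \underbrace{-\log_2 p\big(Q(f(x))\big)}_{\text{rate}}
  \;+\; \lambda\, \underbrace{d\big(x,\; g(Q(f(x)))\big)}_{\text{distortion}} \Big].
```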

Why deep generative modeling?

JM Tomczak - Deep Generative Modeling, 2024 - Springer
Before we start thinking about (deep) generative modeling, let us consider a simple
example. Imagine we have trained a deep neural network that classifies images (x ∈ ℤ^D) of …

End-to-end optimized versatile image compression with wavelet-like transform

H Ma, D Liu, N Yan, H Li, F Wu - IEEE Transactions on Pattern Analysis and Machine Intelligence, 2020 - ieeexplore.ieee.org
Built on deep networks, end-to-end optimized image compression has made impressive
progress in the past few years. Previous studies usually adopt a compressive auto-encoder …
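To make the compressive auto-encoder setup concrete, here is a heavily simplified PyTorch sketch of one rate-distortion training step with additive-noise quantization; the tiny conv transforms, the unit-Gaussian rate proxy, and lambda are placeholders, not the paper's wavelet-like transform or entropy model.

```python
import math
import torch, torch.nn as nn

# Heavily simplified compressive auto-encoder: analysis transform -> quantization
# (relaxed to additive uniform noise) -> synthesis transform, trained on
# rate + lambda * distortion. The conv nets, the unit-Gaussian rate proxy, and
# lambda are placeholders; the paper instead learns a wavelet-like transform.
analysis = nn.Sequential(nn.Conv2d(3, 32, 5, stride=2, padding=2), nn.GELU(),
                         nn.Conv2d(32, 8, 5, stride=2, padding=2))
synthesis = nn.Sequential(nn.ConvTranspose2d(8, 32, 5, 2, 2, output_padding=1), nn.GELU(),
                          nn.ConvTranspose2d(32, 3, 5, 2, 2, output_padding=1))

def train_step(x, lam=0.01):
    y = analysis(x)                                    # latent representation
    y_hat = y + torch.rand_like(y) - 0.5               # noise stands in for rounding
    rate = 0.5 * (y_hat ** 2).mean() / math.log(2.0)   # crude unit-Gaussian bits proxy
    distortion = ((synthesis(y_hat) - x) ** 2).mean()  # reconstruction MSE
    loss = rate + lam * distortion
    loss.backward()                                    # gradients for both transforms
    return loss.item()

print(train_step(torch.rand(2, 3, 64, 64)))            # stand-in image batch
```

The additive uniform noise keeps the objective differentiable during training; at test time a codec of this kind would round the latents and entropy-code them instead.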