Normalizing flows for probabilistic modeling and inference
Normalizing flows provide a general mechanism for defining expressive probability
distributions, only requiring the specification of a (usually simple) base distribution and a …
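The truncated clause refers to an invertible transform applied to the base distribution; as a reminder of the standard change-of-variables identity behind this (the notation T for the transform and p_u for the base density is the usual convention, not quoted from the abstract):

    p_x(x) = p_u\big(T^{-1}(x)\big)\,\big|\det J_{T^{-1}}(x)\big| = p_u(u)\,\big|\det J_T(u)\big|^{-1}, \qquad x = T(u),\; u \sim p_u.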
Deep generative modelling: A comparative review of VAEs, GANs, normalizing flows, energy-based and autoregressive models
Deep generative models are a class of techniques that train deep neural networks to model
the distribution of training samples. Research has fragmented into various interconnected …
Normalizing flows: An introduction and review of current methods
Normalizing Flows are generative models which produce tractable distributions where both
sampling and density evaluation can be efficient and exact. The goal of this survey article is …
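To make "sampling and density evaluation can be efficient and exact" concrete, here is a minimal NumPy sketch; the element-wise affine transform and standard-normal base are illustrative choices, not anything specific to the survey:

```python
import numpy as np

# Minimal normalizing flow: standard-normal base + element-wise affine transform.
# Both sampling and exact log-density take one cheap pass each.
class AffineFlow:
    def __init__(self, log_scale, shift):
        self.log_scale = np.asarray(log_scale, dtype=float)
        self.shift = np.asarray(shift, dtype=float)

    def sample(self, n, rng=np.random.default_rng(0)):
        u = rng.standard_normal((n, self.shift.size))   # draw from the base distribution
        return u * np.exp(self.log_scale) + self.shift   # push through the transform

    def log_prob(self, x):
        u = (x - self.shift) * np.exp(-self.log_scale)   # invert the transform
        base = -0.5 * np.sum(u**2 + np.log(2 * np.pi), axis=-1)
        return base - np.sum(self.log_scale)             # change-of-variables correction

flow = AffineFlow(log_scale=[0.5, -0.3], shift=[1.0, 2.0])
x = flow.sample(5)
print(flow.log_prob(x))  # exact densities of the samples
```

Deeper flows compose many such invertible layers; the log-determinant corrections simply add up.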
Maximum likelihood training of score-based diffusion models
Score-based diffusion models synthesize samples by reversing a stochastic process that
diffuses data to noise, and are trained by minimizing a weighted combination of score …
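In common notation, the weighted score-matching objective the sentence is cut off at reads as follows (s_θ is the score network, p_t(x_t | x_0) the forward noising kernel, λ(t) a weighting function):

    \mathcal{L}(\theta) = \mathbb{E}_{t}\Big[\lambda(t)\,\mathbb{E}_{x_0}\,\mathbb{E}_{x_t \mid x_0}\big\|\, s_\theta(x_t, t) - \nabla_{x_t}\log p_t(x_t \mid x_0) \,\big\|_2^2\Big].

Choosing λ(t) = g(t)² (the squared diffusion coefficient) gives the likelihood weighting the title refers to, under which the objective upper-bounds the negative log-likelihood.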
Analog bits: Generating discrete data using diffusion models with self-conditioning
We present Bit Diffusion: a simple and generic approach for generating discrete data with
continuous state and continuous time diffusion models. The main idea behind our approach …
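A rough sketch of the discrete-to-"analog bits" conversion the title alludes to; the helper names and the ±1 scaling are illustrative assumptions, and the paper's self-conditioning trick is omitted:

```python
import numpy as np

def to_analog_bits(tokens, num_bits):
    """Map integer tokens to real-valued bits in {-1, +1} (the 'analog bits' idea)."""
    tokens = np.asarray(tokens)
    bits = (tokens[..., None] >> np.arange(num_bits)) & 1   # binary expansion, LSB first
    return bits.astype(float) * 2.0 - 1.0                   # {0, 1} -> {-1, +1}

def from_analog_bits(analog):
    """Threshold the (possibly noisy) real-valued bits and reassemble the integers."""
    bits = (analog > 0).astype(int)
    return (bits << np.arange(bits.shape[-1])).sum(axis=-1)

x = np.array([3, 7, 250])
a = to_analog_bits(x, num_bits=8)   # continuous state a continuous diffusion model can generate
noisy = a + 0.1 * np.random.default_rng(0).normal(size=a.shape)
print(from_analog_bits(noisy))      # recovers [3, 7, 250] when the noise is small
```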
Argmax flows and multinomial diffusion: Learning categorical distributions
Generative flows and diffusion models have been predominantly trained on ordinal data, for
example natural images. This paper introduces two extensions of flows and diffusion for …
Language modeling is compression
It has long been established that predictive models can be transformed into lossless
compressors and vice versa. Incidentally, in recent years, the machine learning community …
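The transformation the abstract mentions rests on the fact that a predictive model's negative log-probabilities are exactly the code lengths an ideal entropy coder (e.g. arithmetic coding) driven by that model would spend; a toy illustration with a made-up unigram character model:

```python
import math

# Toy next-character model (a unigram here; any autoregressive p(x_t | x_<t) works the same way).
model = {'a': 0.5, 'b': 0.25, 'c': 0.25}

def compressed_bits(text, p):
    """Ideal code length in bits that an arithmetic coder driven by `p` would achieve."""
    return sum(-math.log2(p[ch]) for ch in text)

print(compressed_bits("aabac", model))  # 7.0 bits: better prediction -> shorter code
```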
An introduction to neural data compression
Neural compression is the application of neural networks and other machine learning
methods to data compression. Recent advances in statistical machine learning have opened …
Why deep generative modeling?
JM Tomczak - Deep Generative Modeling, 2024 - Springer
Before we start thinking about (deep) generative modeling, let us consider a simple
example. Imagine we have trained a deep neural network that classifies images (x ∈ ℤ^D) of …
End-to-end optimized versatile image compression with wavelet-like transform
Built on deep networks, end-to-end optimized image compression has made impressive
progress in the past few years. Previous studies usually adopt a compressive auto-encoder …
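A compressive auto-encoder of this kind is typically trained end-to-end on a rate-distortion Lagrangian; in the usual notation (analysis transform g_a, quantizer Q, synthesis transform g_s, learned entropy model p_ŷ, trade-off λ), stated here as the standard setup rather than a quote from this paper:

    L = \mathbb{E}_x\big[-\log_2 p_{\hat y}(\hat y)\big] + \lambda\,\mathbb{E}_x\big[d(x, \hat x)\big], \qquad \hat y = Q(g_a(x)), \quad \hat x = g_s(\hat y),

where the first term is the bit rate after entropy coding and the second is the reconstruction distortion.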