Deep generative modelling: A comparative review of VAEs, GANs, normalizing flows, energy-based and autoregressive models

S Bond-Taylor, A Leach, Y Long… - IEEE transactions on …, 2021 - ieeexplore.ieee.org
Deep generative models are a class of techniques that train deep neural networks to model
the distribution of training samples. Research has fragmented into various interconnected …

Normalizing flows for probabilistic modeling and inference

G Papamakarios, E Nalisnick, DJ Rezende… - Journal of Machine …, 2021 - jmlr.org
Normalizing flows provide a general mechanism for defining expressive probability
distributions, only requiring the specification of a (usually simple) base distribution and a …

The frontier of simulation-based inference

K Cranmer, J Brehmer, G Louppe - … of the National Academy of Sciences, 2020 - pnas.org
Many domains of science have developed complex simulations to describe phenomena of
interest. While these simulations provide high-fidelity models, they are poorly suited for …

Normalizing flows: An introduction and review of current methods

I Kobyzev, SJD Prince… - IEEE transactions on …, 2020 - ieeexplore.ieee.org
Normalizing Flows are generative models which produce tractable distributions where both
sampling and density evaluation can be efficient and exact. The goal of this survey article is …

Neural spline flows

C Durkan, A Bekasov, I Murray… - Advances in neural …, 2019 - proceedings.neurips.cc
A normalizing flow models a complex probability density as an invertible transformation of a
simple base density. Flows based on either coupling or autoregressive transforms both offer …
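The construction these abstracts describe — an invertible transform of a simple base density, with exact density evaluation via the change-of-variables formula — can be sketched in a few lines. This is a minimal illustrative example (not code from any of the listed papers), using a 1-D affine transform with assumed scale/shift parameters `A` and `B`:

```python
import math

# Illustrative sketch of a normalizing flow: x = A*z + B over a
# standard-normal base density. The exact log-density follows from
# the change-of-variables formula:
#   log p_x(x) = log p_z(f^{-1}(x)) + log |d f^{-1}/dx|.

A, B = 2.0, 1.0  # assumed flow parameters, chosen for illustration

def base_log_prob(z):
    """Standard-normal log-density of the base distribution."""
    return -0.5 * (z * z + math.log(2 * math.pi))

def flow_log_prob(x):
    """Exact log-density of x under the flow (sampling and density
    evaluation are both tractable, as the surveys note)."""
    z = (x - B) / A                              # inverse transform
    return base_log_prob(z) - math.log(abs(A))   # Jacobian correction

def flow_sample(z):
    """Map a base sample z to a sample from the modelled density."""
    return A * z + B
```

Coupling, autoregressive, and spline flows replace the affine map with richer invertible transforms, but the log-density computation has the same two terms: base log-probability plus a log-Jacobian correction.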

Normalizing flows on tori and spheres

DJ Rezende, G Papamakarios… - International …, 2020 - proceedings.mlr.press
Normalizing flows are a powerful tool for building expressive distributions in high
dimensions. So far, most of the literature has concentrated on learning flows on Euclidean …

Causal autoregressive flows

I Khemakhem, R Monti, R Leech… - International …, 2021 - proceedings.mlr.press
Two apparently unrelated fields—normalizing flows and causality—have recently received
considerable attention in the machine learning community. In this work, we highlight an …

On contrastive learning for likelihood-free inference

C Durkan, I Murray… - … conference on machine …, 2020 - proceedings.mlr.press
Likelihood-free methods perform parameter inference in stochastic simulator models where
evaluating the likelihood is intractable but sampling synthetic data is possible. One class of …

An unfolding method based on conditional Invertible Neural Networks (cINN) using iterative training

M Backes, A Butter, M Dunford, B Malaescu - SciPost Physics Core, 2024 - scipost.org
The unfolding of detector effects is crucial for the comparison of data to theory predictions.
While traditional methods are limited to representing the data in a low number of …

Generative networks for precision enthusiasts

A Butter, T Heimel, S Hummerich, T Krebs, T Plehn… - SciPost Physics, 2023 - scipost.org
Generative networks are opening new avenues in fast event generation for the LHC. We
show how generative flow networks can reach percent-level precision for kinematic …