Normalizing flows for probabilistic modeling and inference

G Papamakarios, E Nalisnick, DJ Rezende… - Journal of Machine …, 2021 - jmlr.org
Normalizing flows provide a general mechanism for defining expressive probability
distributions, only requiring the specification of a (usually simple) base distribution and a …
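For orientation, the mechanism the snippet alludes to is the change-of-variables formula: push a base sample through an invertible, differentiable transform and correct the density by the Jacobian determinant. A sketch in standard flow notation (not quoted from the paper):

```latex
% Change of variables for a flow x = T(u) with base density p_u and invertible,
% differentiable transform T:
\[
  p_x(x) = p_u\bigl(T^{-1}(x)\bigr)\,\bigl|\det J_{T^{-1}}(x)\bigr|
         = p_u(u)\,\bigl|\det J_T(u)\bigr|^{-1}.
\]
% Composing simple bijections T = T_K \circ \cdots \circ T_1 keeps the log-density tractable:
\[
  \log p_x(x) = \log p_u(u) - \sum_{k=1}^{K} \log\bigl|\det J_{T_k}(u_k)\bigr|,
  \qquad u_1 = u,\quad u_{k+1} = T_k(u_k).
\]
```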

Deep generative modelling: A comparative review of VAEs, GANs, normalizing flows, energy-based and autoregressive models

S Bond-Taylor, A Leach, Y Long… - IEEE Transactions on …, 2021 - ieeexplore.ieee.org
Deep generative models are a class of techniques that train deep neural networks to model
the distribution of training samples. Research has fragmented into various interconnected …

XLNet: Generalized Autoregressive Pretraining for Language Understanding

Z Yang - arXiv preprint arXiv:1906.08237, 2019 - fq.pkwyx.com
With the capability of modeling bidirectional contexts, denoising autoencoding based
pretraining like BERT achieves better performance than pretraining approaches based on …
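The alternative XLNet develops is permutation language modeling: keep an autoregressive factorization but average it over factorization orders, so every token can condition on context from both sides. A sketch of the two objectives in the usual notation (a paraphrase, not a quotation from the paper):

```latex
% Standard forward autoregressive pretraining maximizes
\[
  \max_\theta \; \sum_{t=1}^{T} \log p_\theta\bigl(x_t \mid x_{<t}\bigr),
\]
% whereas XLNet's permutation language modeling objective averages the autoregressive
% factorization over orders, Z_T being the set of permutations of [1, ..., T]:
\[
  \max_\theta \; \mathbb{E}_{z \sim Z_T}
  \Bigl[\, \sum_{t=1}^{T} \log p_\theta\bigl(x_{z_t} \mid x_{z_{<t}}\bigr) \Bigr].
\]
```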

An introduction to variational autoencoders

DP Kingma, M Welling - Foundations and Trends® in …, 2019 - nowpublishers.com
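For orientation, the central quantity of the variational autoencoder framework this monograph introduces is the evidence lower bound (ELBO), sketched here in standard notation (not quoted from the text):

```latex
% Evidence lower bound (ELBO) for a VAE with encoder q_\phi(z|x), decoder p_\theta(x|z),
% and prior p(z); training maximizes this bound jointly over \theta and \phi:
\[
  \log p_\theta(x) \;\ge\;
  \mathbb{E}_{q_\phi(z \mid x)}\bigl[\log p_\theta(x \mid z)\bigr]
  - \mathrm{KL}\bigl(q_\phi(z \mid x)\,\|\,p(z)\bigr).
\]
```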

GAN inversion: A survey

W **a, Y Zhang, Y Yang, JH Xue… - IEEE transactions on …, 2022 - ieeexplore.ieee.org
GAN inversion aims to invert a given image back into the latent space of a pretrained GAN
model so that the image can be faithfully reconstructed from the inverted code by the …
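One broad family the survey covers is optimization-based inversion: freeze a pretrained generator and search latent space for a code whose output reconstructs the target image. A minimal PyTorch sketch of that idea, with a randomly initialized toy generator standing in for a real pretrained model (the generator, target, and hyperparameters below are illustrative assumptions, not from the survey):

```python
import torch
import torch.nn as nn

# Toy stand-in for a pretrained generator (in practice: a trained StyleGAN, BigGAN, etc.).
toy_generator = nn.Sequential(
    nn.Linear(64, 256), nn.ReLU(),
    nn.Linear(256, 3 * 32 * 32), nn.Tanh(),
)
for p in toy_generator.parameters():
    p.requires_grad_(False)  # the generator stays frozen; only the latent code is optimized

target = torch.rand(1, 3 * 32 * 32) * 2 - 1  # placeholder "image" to invert

z = torch.zeros(1, 64, requires_grad=True)   # latent code being recovered
opt = torch.optim.Adam([z], lr=0.05)

for step in range(500):
    opt.zero_grad()
    recon = toy_generator(z)
    loss = ((recon - target) ** 2).mean()    # pixel reconstruction loss
    loss.backward()
    opt.step()

print(f"final reconstruction error: {loss.item():.4f}")
```

Practical pipelines typically add perceptual losses and latent regularizers, or train an encoder to predict the code directly; the survey organizes methods roughly along these lines.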

Normalizing flows: An introduction and review of current methods

I Kobyzev, SJD Prince… - IEEE Transactions on …, 2020 - ieeexplore.ieee.org
Normalizing Flows are generative models which produce tractable distributions where both
sampling and density evaluation can be efficient and exact. The goal of this survey article is …
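As a concrete, if deliberately trivial, illustration of that "efficient and exact in both directions" property: a single untrained elementwise affine flow over a Gaussian base already supports one-pass sampling and exact log-density evaluation (all names and parameter values below are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)

# A minimal flow: elementwise affine transform x = exp(s) * u + t applied to a
# standard-normal base. Both directions are cheap, so sampling and exact density
# evaluation are both available.
dim = 2
log_scale = rng.normal(size=dim) * 0.1   # s (illustrative, untrained parameters)
shift = rng.normal(size=dim)             # t

def sample(n):
    u = rng.standard_normal((n, dim))            # draw from the base distribution
    return np.exp(log_scale) * u + shift         # push forward through the flow

def log_prob(x):
    u = (x - shift) * np.exp(-log_scale)         # invert the transform exactly
    base = -0.5 * (u ** 2 + np.log(2 * np.pi)).sum(axis=1)
    return base - log_scale.sum()                # change of variables: subtract log|det J|

x = sample(5)
print(log_prob(x))
```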

Density estimation using Real NVP

L Dinh, J Sohl-Dickstein, S Bengio - arXiv preprint arXiv:1605.08803, 2016 - arxiv.org
Unsupervised learning of probabilistic models is a central yet challenging problem in
machine learning. Specifically, designing models with tractable learning, sampling …
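Real NVP's building block is the affine coupling layer: part of the input passes through unchanged and parameterizes an elementwise scale-and-shift of the rest, so the inverse and the Jacobian log-determinant stay cheap. A minimal NumPy sketch of one such layer (the tiny linear conditioner is a stand-in, not the paper's convolutional architecture):

```python
import numpy as np

rng = np.random.default_rng(1)
d, D = 2, 4          # the first d dims pass through unchanged and condition the rest

# Tiny random "conditioner" mapping x[:d] -> (log_scale, shift) for x[d:].
W = rng.normal(size=(d, 2 * (D - d))) * 0.1
b = np.zeros(2 * (D - d))

def conditioner(x1):
    h = x1 @ W + b
    return h[:, : D - d], h[:, D - d :]          # log-scale s(x1), shift t(x1)

def forward(x):
    x1, x2 = x[:, :d], x[:, d:]
    s, t = conditioner(x1)
    y = np.concatenate([x1, x2 * np.exp(s) + t], axis=1)
    log_det = s.sum(axis=1)                      # log|det J| is just the sum of log-scales
    return y, log_det

def inverse(y):
    y1, y2 = y[:, :d], y[:, d:]
    s, t = conditioner(y1)                       # y1 == x1, so s and t are recoverable
    return np.concatenate([y1, (y2 - t) * np.exp(-s)], axis=1)

x = rng.standard_normal((3, D))
y, log_det = forward(x)
print(np.allclose(inverse(y), x), log_det)       # exact invertibility check
```

In the paper, successive layers alternate which part is held fixed (checkerboard and channel-wise masks) so that every dimension is eventually transformed.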

Pixel recurrent neural networks

A Van Den Oord, N Kalchbrenner… - … on machine learning, 2016 - proceedings.mlr.press
Modeling the distribution of natural images is a landmark problem in unsupervised learning.
This task requires an image model that is at once expressive, tractable and scalable. We …
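The tractability asked for here comes from factorizing the image likelihood autoregressively over pixels, with each conditional a discrete distribution over intensity values; the paper's contribution is the recurrent/convolutional parameterization of those conditionals. In the model's standard notation:

```latex
% Autoregressive factorization of the likelihood of an n x n image x,
% pixels taken in raster-scan order:
\[
  p(\mathbf{x}) = \prod_{i=1}^{n^2} p\bigl(x_i \mid x_1, \ldots, x_{i-1}\bigr),
\]
% with each pixel's colour channels modeled sequentially as well:
\[
  p(x_i \mid \mathbf{x}_{<i}) =
  p(x_{i,R} \mid \mathbf{x}_{<i})\,
  p(x_{i,G} \mid \mathbf{x}_{<i}, x_{i,R})\,
  p(x_{i,B} \mid \mathbf{x}_{<i}, x_{i,R}, x_{i,G}).
\]
```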

The frontier of simulation-based inference

K Cranmer, J Brehmer… - Proceedings of the …, 2020 - National Acad Sciences
Many domains of science have developed complex simulations to describe phenomena of
interest. While these simulations provide high-fidelity models, they are poorly suited for …
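The simplest member of this family is rejection ABC: sample parameters from the prior, run the simulator, and keep only the draws whose simulated data fall close to the observation. A self-contained toy sketch (the Gaussian simulator, summary statistic, and tolerance are illustrative choices, not from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)

def simulator(theta, n=50):
    # Toy simulator: n noisy observations around an unknown mean theta.
    return rng.normal(loc=theta, scale=1.0, size=n)

x_obs = simulator(theta=1.5)                 # pretend this is the real data

def summary(x):
    return x.mean()                          # summary statistic

accepted = []
for _ in range(20_000):
    theta = rng.uniform(-5.0, 5.0)           # draw from the prior
    x_sim = simulator(theta)
    if abs(summary(x_sim) - summary(x_obs)) < 0.1:   # tolerance epsilon
        accepted.append(theta)               # keep: approximate posterior sample

accepted = np.array(accepted)
print(len(accepted), accepted.mean(), accepted.std())
```

The "frontier" the paper describes largely concerns replacing this rejection step with learned surrogates such as neural density or density-ratio estimators, many of them built on normalizing flows.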

Improved variational inference with inverse autoregressive flow

DP Kingma, T Salimans, R Jozefowicz… - Advances in neural …, 2016 - proceedings.neurips.cc
The framework of normalizing flows provides a general strategy for flexible variational
inference of posteriors over latent variables. We propose a new type of normalizing flow …
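The proposed flow, inverse autoregressive flow (IAF), chains steps in which an autoregressive network reads the previous iterate and emits per-dimension shifts and scales, so each whole step can be computed in one parallel pass. A sketch of the update and its log-determinant, following the paper's notation up to minor details:

```latex
% One IAF step: an autoregressive network reads the previous iterate z_{t-1}
% (and an optional context h) and outputs per-dimension shift \mu_t and scale \sigma_t:
\[
  [\mu_t, \sigma_t] \leftarrow \mathrm{AutoregressiveNN}_t(z_{t-1}, h;\, \theta),
  \qquad
  z_t = \sigma_t \odot z_{t-1} + \mu_t.
\]
% The Jacobian of each step is triangular, so its log-determinant is simply
\[
  \log\Bigl|\det \tfrac{\partial z_t}{\partial z_{t-1}}\Bigr|
  = \sum_{i=1}^{D} \log \sigma_{t,i}.
\]
```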