Normalizing flows for probabilistic modeling and inference
Normalizing flows provide a general mechanism for defining expressive probability
distributions, only requiring the specification of a (usually simple) base distribution and a …
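As a rough illustration of the mechanism this abstract describes (a simple base distribution pushed through an invertible transformation), here is a minimal NumPy sketch; the affine transform and its parameters are illustrative choices, not taken from the paper.

```python
import numpy as np

# Base distribution: standard normal. Transform: invertible affine map x = a*z + b
# (a deliberately simple stand-in for the learned transformations used in practice).
a, b = 2.0, 0.5  # illustrative transform parameters

def sample(n):
    z = np.random.randn(n)      # draw from the base distribution
    return a * z + b            # push samples through the transform

def log_density(x):
    z = (x - b) / a                                   # invert the transform
    log_base = -0.5 * (z**2 + np.log(2.0 * np.pi))    # log N(z; 0, 1)
    log_det = -np.log(np.abs(a))                      # change-of-variables term -log|dx/dz|
    return log_base + log_det
```

Sampling needs only the forward transform, and density evaluation needs only the inverse plus the log-Jacobian term; this is the pattern the flow papers below build on.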
Deep generative modelling: A comparative review of VAEs, GANs, normalizing flows, energy-based and autoregressive models
Deep generative models are a class of techniques that train deep neural networks to model
the distribution of training samples. Research has fragmented into various interconnected …
XLNet: Generalized Autoregressive Pretraining for Language Understanding
Z Yang - arXiv preprint arXiv:1906.08237, 2019
With the capability of modeling bidirectional contexts, denoising autoencoding based
pretraining like BERT achieves better performance than pretraining approaches based on …
An introduction to variational autoencoders
A monograph in the Foundations and Trends® in Machine Learning series.
GAN inversion: A survey
GAN inversion aims to invert a given image back into the latent space of a pretrained GAN
model so that the image can be faithfully reconstructed from the inverted code by the …
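To make the inversion task concrete, below is a minimal sketch of one common optimization-based strategy, assuming a pretrained generator and a pixel-wise reconstruction loss; the function names and hyperparameters are illustrative, not taken from the survey.

```python
import torch

def invert(generator, target_image, latent_dim, steps=500, lr=0.05):
    # Find a latent code z whose generated image matches the target.
    z = torch.randn(1, latent_dim, requires_grad=True)    # initial latent code
    optimizer = torch.optim.Adam([z], lr=lr)
    for _ in range(steps):
        optimizer.zero_grad()
        reconstruction = generator(z)                      # G(z)
        loss = torch.nn.functional.mse_loss(reconstruction, target_image)
        loss.backward()
        optimizer.step()                                   # only z is in the optimizer, so the generator stays fixed
    return z.detach()                                      # the inverted code
```

Encoder-based and hybrid variants replace or warm-start this per-image optimization with a learned inversion network.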
Normalizing flows: An introduction and review of current methods
Normalizing Flows are generative models which produce tractable distributions where both
sampling and density evaluation can be efficient and exact. The goal of this survey article is …
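The exact density evaluation mentioned here rests on the change-of-variables formula for an invertible map; in standard notation (a paraphrase, not a quotation from the survey):

```latex
% x = f(z), with f invertible and z \sim p_Z
p_X(x) = p_Z\!\left(f^{-1}(x)\right)
         \left| \det \frac{\partial f^{-1}(x)}{\partial x} \right|
```

Flows are engineered so that both the inverse map and the Jacobian determinant are cheap to compute, which is what makes sampling and density evaluation efficient.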
Density estimation using Real NVP
Unsupervised learning of probabilistic models is a central yet challenging problem in
machine learning. Specifically, designing models with tractable learning, sampling …
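The tractability Real NVP targets comes from affine coupling layers, whose triangular Jacobian makes the log-determinant a simple sum; a minimal sketch of one coupling layer follows, with placeholder scale and translation functions s and t standing in for the neural networks used in the paper.

```python
import numpy as np

def s(x1): return 0.5 * np.tanh(x1)     # placeholder scale network
def t(x1): return 0.1 * x1              # placeholder translation network

def coupling_forward(x1, x2):
    # Leave x1 unchanged; transform x2 using functions of x1 only.
    y1 = x1
    y2 = x2 * np.exp(s(x1)) + t(x1)
    log_det = np.sum(s(x1))             # log|det J| is just the sum of the log-scales
    return y1, y2, log_det

def coupling_inverse(y1, y2):
    # Exact inverse without having to invert s or t themselves.
    x1 = y1
    x2 = (y2 - t(y1)) * np.exp(-s(y1))
    return x1, x2
```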
Pixel recurrent neural networks
Modeling the distribution of natural images is a landmark problem in unsupervised learning.
This task requires an image model that is at once expressive, tractable and scalable. We …
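The expressiveness/tractability trade-off described here is addressed by modeling the image autoregressively, one pixel at a time in raster-scan order; the underlying factorization (standard for this family of models) is

```latex
p(\mathbf{x}) = \prod_{i=1}^{n} p\!\left(x_i \mid x_1, \ldots, x_{i-1}\right)
```

so the likelihood is exact and training reduces to next-pixel prediction.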
The frontier of simulation-based inference
Many domains of science have developed complex simulations to describe phenomena of
interest. While these simulations provide high-fidelity models, they are poorly suited for …
Improved variational inference with inverse autoregressive flow
The framework of normalizing flows provides a general strategy for flexible variational
inference of posteriors over latent variables. We propose a new type of normalizing flow …
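The new flow proposed here, inverse autoregressive flow (IAF), transforms the latent sample with autoregressive networks so that the Jacobian stays triangular; in the commonly used formulation (my summary, not a quotation), one step is

```latex
z_t = \mu_t + \sigma_t \odot z_{t-1},
\qquad
\log\left|\det \frac{\partial z_t}{\partial z_{t-1}}\right| = \sum_i \log \sigma_{t,i}
```

where \mu_t and \sigma_t are produced by an autoregressive network applied to z_{t-1}, so each output dimension depends only on earlier dimensions of z_{t-1}.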