Dynamical variational autoencoders: A comprehensive review
Variational autoencoders (VAEs) are powerful deep generative models widely used to
represent high-dimensional complex data through a low-dimensional latent space learned …
The free energy principle for perception and action: A deep learning perspective
The free energy principle, and its corollary active inference, constitute a bio-inspired theory
that assumes biological agents act to remain in a restricted set of preferred states of the …
Score-based generative modeling in latent space
Score-based generative models (SGMs) have recently demonstrated impressive results in
terms of both sample quality and distribution coverage. However, they are usually applied …
NVAE: A deep hierarchical variational autoencoder
Normalizing flows, autoregressive models, variational autoencoders (VAEs), and deep
energy-based models are among competing likelihood-based frameworks for deep …
Maximum likelihood training of score-based diffusion models
Score-based diffusion models synthesize samples by reversing a stochastic process that
diffuses data to noise, and are trained by minimizing a weighted combination of score …
Step-unrolled denoising autoencoders for text generation
In this paper we propose a new generative model of text, Step-unrolled Denoising
Autoencoder (SUNDAE), that does not rely on autoregressive models. Similarly to denoising …
Don't blame the ELBO! A linear VAE perspective on posterior collapse
Posterior collapse in Variational Autoencoders (VAEs) with uninformative priors
arises when the variational posterior distribution closely matches the prior for a subset of …
Maximum likelihood training of implicit nonlinear diffusion model
Whereas diverse variations of diffusion models exist, extending the linear diffusion into a
nonlinear diffusion process is investigated by very few works. The nonlinearity effect has …
Learning energy-based prior model with diffusion-amortized MCMC
Latent space EBMs, also known as energy-based priors, have drawn growing interest in
the field of generative modeling due to their flexibility in formulation and strong modeling …
Bilateral variational autoencoder for collaborative filtering
Preference data is a form of dyadic data, with measurements associated with pairs of
elements arising from two discrete sets of objects. These are users and items, as well as …