Dynamical variational autoencoders: A comprehensive review
Variational autoencoders (VAEs) are powerful deep generative models widely used to
represent high-dimensional complex data through a low-dimensional latent space learned …
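As a point of reference for this line of work, a minimal sketch of the standard VAE that dynamical VAEs extend (encoder/decoder architecture, dimensions, and the Gaussian likelihood are assumptions for illustration, not details from the paper):

```python
# Minimal VAE sketch: a low-dimensional Gaussian latent space learned jointly
# with an encoder/decoder pair. All sizes and the MSE likelihood are assumed.
import torch
import torch.nn as nn

class VAE(nn.Module):
    def __init__(self, x_dim=784, z_dim=16, h_dim=256):
        super().__init__()
        self.enc = nn.Sequential(nn.Linear(x_dim, h_dim), nn.ReLU())
        self.mu = nn.Linear(h_dim, z_dim)
        self.logvar = nn.Linear(h_dim, z_dim)
        self.dec = nn.Sequential(nn.Linear(z_dim, h_dim), nn.ReLU(),
                                 nn.Linear(h_dim, x_dim))

    def elbo(self, x):
        h = self.enc(x)
        mu, logvar = self.mu(h), self.logvar(h)
        z = mu + torch.randn_like(mu) * torch.exp(0.5 * logvar)  # reparameterisation
        x_hat = self.dec(z)
        recon = -nn.functional.mse_loss(x_hat, x, reduction="sum")   # log p(x|z) up to const.
        kl = -0.5 * torch.sum(1 + logvar - mu.pow(2) - logvar.exp())  # KL(q(z|x) || p(z))
        return recon - kl
```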
Memory as a computational resource
Computer scientists have long recognized that naive implementations of algorithms often
result in a paralyzing degree of redundant computation. More sophisticated implementations …
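A small illustration of the point about redundant computation, using memoization (the classic Fibonacci example, chosen here for brevity rather than taken from the paper):

```python
# Caching intermediate results removes the redundant computation that a naive
# recursive implementation performs over and over.
from functools import lru_cache

def fib_naive(n):
    return n if n < 2 else fib_naive(n - 1) + fib_naive(n - 2)  # exponential time

@lru_cache(maxsize=None)
def fib_memo(n):
    return n if n < 2 else fib_memo(n - 1) + fib_memo(n - 2)    # linear time

print(fib_memo(200))  # returns instantly; fib_naive(200) would be infeasible
```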
Disentangling disentanglement in variational autoencoders
We develop a generalisation of disentanglement in variational autoencoders (VAEs)—
decomposition of the latent representation—characterising it as the fulfilment of two factors …
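For context, a sketch of the beta-weighted ELBO (beta-VAE style) that much of the disentanglement literature builds on; this is a common baseline objective, not the specific two-factor characterisation proposed in the paper:

```python
# Illustrative beta-weighted ELBO; beta > 1 increases pressure on the latent
# code toward the prior. Inputs are assumed to come from a VAE's forward pass.
import torch

def beta_elbo(recon_log_prob, mu, logvar, beta=4.0):
    """recon_log_prob: log p(x|z) per batch element; mu, logvar: q(z|x) parameters."""
    kl = -0.5 * torch.sum(1 + logvar - mu.pow(2) - logvar.exp(), dim=-1)
    return recon_log_prob - beta * kl
```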
Deep variational reinforcement learning for POMDPs
Many real-world sequential decision making problems are partially observable by nature,
and the environment model is typically unknown. Consequently, there is great need for …
Sequential attend, infer, repeat: Generative modelling of moving objects
We present Sequential Attend, Infer, Repeat (SQAIR), an interpretable deep
generative model for image sequences. It can reliably discover and track objects through the …
Score-based data assimilation
F Rozet, G Louppe - Advances in Neural Information …, 2023 - proceedings.neurips.cc
Data assimilation, in its most comprehensive form, addresses the Bayesian inverse problem
of identifying plausible state trajectories that explain noisy or incomplete observations of …
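The Bayesian inverse problem referred to here is the standard smoothing posterior over state trajectories in a state-space model (notation assumed for illustration):

```latex
% Posterior over a state trajectory x_{1:T} given observations y_{1:T}
p(x_{1:T} \mid y_{1:T}) \;\propto\; p(x_1)\,\prod_{t=2}^{T} p(x_t \mid x_{t-1})\,\prod_{t=1}^{T} p(y_t \mid x_t)
```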
An introduction to probabilistic programming
This book is a graduate-level introduction to probabilistic programming. It not only provides a
thorough background for anyone wishing to use a probabilistic programming system, but …
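A minimal sketch of what a probabilistic program is in this sense: a generative simulator plus conditioning on observed data, here written in plain Python with likelihood weighting rather than any particular system's API (the coin-flip model and names are assumptions):

```python
# A generative model (prior + likelihood) conditioned on data by importance
# sampling with likelihood weights. Not tied to any specific PPL.
import random, math

def sample_prior():
    return random.random()                      # bias ~ Uniform(0, 1)

def log_likelihood(bias, data):
    return sum(math.log(bias if flip else 1.0 - bias) for flip in data)

def posterior_mean(data, n=100_000):
    samples = [sample_prior() for _ in range(n)]
    weights = [math.exp(log_likelihood(b, data)) for b in samples]
    return sum(w * b for w, b in zip(weights, samples)) / sum(weights)

print(posterior_mean([1, 1, 1, 0, 1]))          # about 0.71 for 4 heads in 5 flips
```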
Tighter variational bounds are not necessarily better
We provide theoretical and empirical evidence that using tighter evidence lower bounds
(ELBOs) can be detrimental to the process of learning an inference network by reducing the …
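The tighter bound in question is the K-sample importance-weighted bound (IWAE); a short sketch of how it is computed, with the model and inference-network log-densities assumed to be given:

```python
# K-sample importance-weighted bound; larger K tightens the bound on log p(x).
import math
import torch

def iwae_bound(log_p_joint, log_q):
    """log_p_joint, log_q: tensors of shape [batch, K] holding
    log p(x, z_k) and log q(z_k | x) for K samples z_k ~ q(z | x)."""
    log_w = log_p_joint - log_q                       # importance log-weights
    K = log_w.shape[-1]
    return torch.logsumexp(log_w, dim=-1) - math.log(K)
```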
Differentiable particle filtering via entropy-regularized optimal transport
Particle Filtering (PF) methods are an established class of procedures for performing
inference in non-linear state-space models. Resampling is a key ingredient of PF necessary …
inference in non-linear state-space models. Resampling is a key ingredient of PF necessary …
Sequential latent knowledge selection for knowledge-grounded dialogue
Knowledge-grounded dialogue is a task of generating an informative response based on
both discourse context and external knowledge. As we focus on better modeling the …