Kernel mean embedding of distributions: A review and beyond
A Hilbert space embedding of a distribution—in short, a kernel mean embedding—has
recently emerged as a powerful tool for machine learning and statistical inference. The basic …
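The kernel mean embedding above admits a simple empirical estimate, and the distance between two embeddings is the maximum mean discrepancy (MMD). A minimal numpy sketch, assuming a Gaussian RBF kernel (the kernel choice and bandwidth are illustrative, not from the paper):

```python
import numpy as np

def rbf_kernel(X, Y, bandwidth=1.0):
    # Gram matrix of the Gaussian RBF kernel k(x, y) = exp(-||x - y||^2 / (2 h^2))
    sq = np.sum(X**2, 1)[:, None] + np.sum(Y**2, 1)[None, :] - 2 * X @ Y.T
    return np.exp(-sq / (2 * bandwidth**2))

def mmd2(X, Y, bandwidth=1.0):
    # Biased empirical squared MMD between the mean embeddings of X and Y:
    # ||mu_X - mu_Y||^2 = E k(x, x') - 2 E k(x, y) + E k(y, y')
    return (rbf_kernel(X, X, bandwidth).mean()
            - 2 * rbf_kernel(X, Y, bandwidth).mean()
            + rbf_kernel(Y, Y, bandwidth).mean())

rng = np.random.default_rng(0)
X = rng.normal(0.0, 1.0, size=(500, 1))   # samples from N(0, 1)
Y = rng.normal(0.5, 1.0, size=(500, 1))   # samples from N(0.5, 1)
print(mmd2(X, X))   # exactly zero for identical samples
print(mmd2(X, Y))   # positive when the distributions differ
```

In practice the bandwidth is often set by the median heuristic rather than fixed at 1.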
Gaussian processes and kernel methods: A review on connections and equivalences
This paper is an attempt to bridge the conceptual gaps between researchers working on the
two widely used approaches based on positive definite kernels: Bayesian learning or …
Benchmarking simulation-based inference
Recent advances in probabilistic modelling have led to a large number of simulation-based
inference algorithms which do not require numerical evaluation of likelihoods. However, a …
Stein variational gradient descent: A general purpose Bayesian inference algorithm
We propose a general purpose variational inference algorithm that forms a natural
counterpart of gradient descent for optimization. Our method iteratively transports a set of …
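The SVGD particle transport described above has a closed-form update. A one-dimensional sketch, assuming a standard-normal target (so the score is simply -x) and a fixed RBF bandwidth; step size, bandwidth, and particle count are illustrative choices, not from the paper:

```python
import numpy as np

def svgd_step(x, score, h=1.0, eps=0.1):
    # One SVGD update: move each particle x_i along
    # phi(x_i) = (1/n) sum_j [ k(x_j, x_i) score(x_j) + d/dx_j k(x_j, x_i) ]
    # The first term pulls particles toward high density, the second repels them.
    diff = x[:, None] - x[None, :]          # diff[j, i] = x_j - x_i
    K = np.exp(-diff**2 / (2 * h**2))       # RBF Gram matrix
    gradK = -diff / h**2 * K                # d/dx_j k(x_j, x_i)
    phi = (K * score(x)[:, None] + gradK).mean(axis=0)
    return x + eps * phi

score = lambda x: -x                        # score of the standard normal target
rng = np.random.default_rng(1)
x = rng.normal(0.0, 2.0, size=200)          # over-dispersed initial particles
for _ in range(500):
    x = svgd_step(x, score)
print(x.mean(), x.std())                    # should approach 0 and 1
```

With a single particle the repulsive term vanishes and the update reduces to gradient ascent on log density, which is the sense in which SVGD is a counterpart of gradient descent.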
A kernelized Stein discrepancy for goodness-of-fit tests
We derive a new discrepancy statistic for measuring differences between two probability
distributions based on combining Stein's identity and the reproducing kernel Hilbert space …
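The discrepancy described above can be estimated from a sample using only the score function of the model, without its normalizing constant. A one-dimensional sketch of the V-statistic estimate, assuming a Gaussian RBF kernel (the kernel and bandwidth are illustrative assumptions):

```python
import numpy as np

def ksd2(x, score, h=1.0):
    # V-statistic estimate of the squared kernelized Stein discrepancy between
    # the sample x and a density p known only through its score s = d/dx log p.
    # The Stein kernel is
    #   u_p(x, y) = s(x) s(y) k(x, y) + s(x) d_y k + s(y) d_x k + d_x d_y k
    d = x[:, None] - x[None, :]             # d[i, j] = x_i - x_j
    K = np.exp(-d**2 / (2 * h**2))          # RBF kernel matrix
    dKdx = -d / h**2 * K                    # d/dx k(x_i, x_j)
    dKdy = d / h**2 * K                     # d/dy k(x_i, x_j)
    d2K = (1 / h**2 - d**2 / h**4) * K      # d^2/(dx dy) k(x_i, x_j)
    s = score(x)
    U = (s[:, None] * s[None, :] * K
         + s[:, None] * dKdy + s[None, :] * dKdx + d2K)
    return U.mean()

score = lambda x: -x                        # score of the standard normal
rng = np.random.default_rng(2)
good = rng.normal(0.0, 1.0, size=1000)      # sample drawn from the model
bad = rng.normal(1.5, 1.0, size=1000)       # shifted sample
print(ksd2(good, score), ksd2(bad, score))  # the shifted sample scores higher
```

Because the Stein kernel is positive semidefinite, the V-statistic is nonnegative and is near zero when the sample matches the model.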
A survey of Monte Carlo methods for parameter estimation
Statistical signal processing applications usually require the estimation of some parameters
of interest given a set of observed data. These estimates are typically obtained either by …
A universal approximation theorem of deep neural networks for expressing probability distributions
This paper studies the universal approximation property of deep neural networks for
representing probability distributions. Given a target distribution $\pi $ and a source …
Stein variational gradient descent as gradient flow
Q Liu - Advances in neural information processing systems, 2017 - proceedings.neurips.cc
Stein variational gradient descent (SVGD) is a deterministic sampling algorithm that
iteratively transports a set of particles to approximate given distributions, based on a …
Detecting out-of-distribution inputs to deep generative models using typicality
Recent work has shown that deep generative models can assign higher likelihood to out-of-
distribution data sets than to their training data (Nalisnick et al., 2019; Choi et al., 2019). We …
Measuring sample quality with kernels
Approximate Markov chain Monte Carlo (MCMC) offers the promise of more rapid
sampling at the cost of more biased inference. Since standard MCMC diagnostics fail to …