Kernel mean embedding of distributions: A review and beyond
K Muandet, K Fukumizu… - … and Trends® in …, 2017 - nowpublishers.com
A Hilbert space embedding of a distribution—in short, a kernel mean embedding—has
recently emerged as a powerful tool for machine learning and statistical inference. The basic …
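For context, the basic construction maps a distribution P to the RKHS element μ_P = E_{x~P}[k(x, ·)], estimated from a sample by an average of kernel sections; the RKHS distance between two embeddings is the maximum mean discrepancy (MMD). A minimal NumPy sketch (RBF kernel and bandwidth chosen arbitrarily here, not taken from the review):

```python
import numpy as np

def rbf(A, B, sigma=1.0):
    """Gaussian (RBF) kernel matrix between the rows of 2-D arrays A and B."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma ** 2))

def mean_embedding_eval(X, Z, sigma=1.0):
    """Empirical kernel mean embedding mu_P(z) = (1/n) sum_i k(x_i, z), at rows of Z."""
    return rbf(X, Z, sigma).mean(axis=0)

def mmd2(X, Y, sigma=1.0):
    """Biased estimate of squared MMD: the RKHS distance between two mean embeddings."""
    return rbf(X, X, sigma).mean() + rbf(Y, Y, sigma).mean() - 2 * rbf(X, Y, sigma).mean()
```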
Adaptive Monte Carlo augmented with normalizing flows
M Gabrié, GM Rotskoff… - Proceedings of the …, 2022 - National Acad Sciences
Many problems in the physical sciences, machine learning, and statistical inference
necessitate sampling from a high-dimensional, multimodal probability distribution. Markov …
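The approach interleaves local MCMC moves with independence Metropolis-Hastings proposals drawn from a normalizing flow that is retrained on the chain's own samples. The sketch below shows only the independence MH step, with a fitted Gaussian standing in for the flow (a simplification for brevity, not the paper's architecture):

```python
import numpy as np
from scipy import stats

def independence_mh_step(x, log_target, proposal, rng):
    """One independence MH move: propose from a global, trainable density
    (here a Gaussian standing in for a normalizing flow) and apply the
    usual acceptance correction."""
    x_prop = proposal.rvs(random_state=rng)
    log_alpha = (log_target(x_prop) - log_target(x)
                 + proposal.logpdf(x) - proposal.logpdf(x_prop))
    return x_prop if np.log(rng.uniform()) < log_alpha else x

# Usage sketch: alternate these global moves with local steps (random walk, MALA, ...)
# and periodically refit `proposal` to the samples collected so far.
samples = np.random.default_rng(0).normal(size=(500, 2))
proposal = stats.multivariate_normal(samples.mean(0), np.cov(samples.T))
```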
Active learning for deep Gaussian process surrogates
Deep Gaussian processes (DGPs) are increasingly popular as predictive models in
machine learning for their nonstationary flexibility and ability to cope with abrupt regime …
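A typical surrogate-based active learning loop acquires the next run where the surrogate's predictive uncertainty is largest (the ALM criterion). The sketch below uses scikit-learn's stationary GP as a stand-in for a deep GP, so it illustrates only the loop structure, not the paper's DGP machinery:

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

def variance_based_design(f, X, y, X_cand, n_rounds=10):
    """Sequential design: refit the surrogate, then acquire the candidate
    input with the largest posterior standard deviation."""
    X, y, X_cand = X.copy(), y.copy(), X_cand.copy()
    for _ in range(n_rounds):
        gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True)
        gp.fit(X, y)
        _, sd = gp.predict(X_cand, return_std=True)
        i = int(np.argmax(sd))                      # most uncertain candidate
        X = np.vstack([X, X_cand[i]])
        y = np.append(y, f(X_cand[i]))
        X_cand = np.delete(X_cand, i, axis=0)
    return X, y
```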
Generalizing Hamiltonian Monte Carlo with neural networks
D Levy, MD Hoffman, J Sohl-Dickstein - arXiv preprint arXiv:1711.09268, 2017 - arxiv.org
We present a general-purpose method to train Markov chain Monte Carlo kernels,
parameterized by deep neural networks, that converge and mix quickly to their target …
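The paper learns neural-network parameterizations of the leapfrog-style update; as a point of reference, a plain (non-learned) HMC transition, which is what the method generalizes, might be sketched as follows:

```python
import numpy as np

def hmc_step(x, log_p, grad_log_p, step_size=0.1, n_leapfrog=20, rng=None):
    """One vanilla HMC transition: resample momentum, integrate Hamiltonian
    dynamics with leapfrog, then Metropolis-correct. Baseline only; the paper
    replaces this fixed update with a trained neural network."""
    rng = rng or np.random.default_rng()
    p0 = rng.standard_normal(x.shape)
    q, p = x.copy(), p0.copy()
    p = p + 0.5 * step_size * grad_log_p(q)
    for _ in range(n_leapfrog - 1):
        q = q + step_size * p
        p = p + step_size * grad_log_p(q)
    q = q + step_size * p
    p = p + 0.5 * step_size * grad_log_p(q)
    h_old = -log_p(x) + 0.5 * p0 @ p0
    h_new = -log_p(q) + 0.5 * p @ p
    return q if np.log(rng.uniform()) < h_old - h_new else x
```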
Eigendecompositions of transfer operators in reproducing kernel Hilbert spaces
Transfer operators such as the Perron–Frobenius or Koopman operator play an important
role in the global analysis of complex dynamical systems. The eigenfunctions of these …
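In the kernel setting, an empirical matrix representation of such an operator can be built from snapshot pairs (x_i, y_i) via Gram matrices, and its eigendecomposition approximates the operator's spectrum. The sketch below is a generic kernel-EDMD-style computation (RBF kernel, arbitrary regularization), not a line-by-line reproduction of the paper:

```python
import numpy as np

def kernel_edmd(X, Y, sigma=1.0, eps=1e-6, n_eig=5):
    """Eigendecomposition of an empirical transfer-operator representation
    built from snapshot pairs (x_i, y_i), e.g. y_i = x_{i+1}."""
    def gram(A, B):
        d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
        return np.exp(-d2 / (2 * sigma ** 2))
    G = gram(X, X)                                    # k(x_i, x_j)
    A = gram(X, Y)                                    # k(x_i, y_j)
    T = np.linalg.solve(G + eps * np.eye(len(X)), A)  # regularized operator matrix
    evals, evecs = np.linalg.eig(T)
    order = np.argsort(-np.abs(evals))[:n_eig]
    return evals[order], evecs[:, order]              # leading eigenvalues / coefficients
```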
[BOOK][B] Bayesian modeling and computation in Python
Bayesian Modeling and Computation in Python aims to help beginner Bayesian
practitioners to become intermediate modelers. It uses a hands-on approach with PyMC3 …
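The workflow the book teaches is to declare priors and a likelihood inside a model context, then sample the posterior with MCMC. A minimal PyMC3 example in that style (a toy linear regression, not an example taken from the book):

```python
import numpy as np
import pymc3 as pm

rng = np.random.default_rng(0)
x = rng.normal(size=100)
y = 1.0 + 2.0 * x + rng.normal(scale=0.5, size=100)

with pm.Model():
    intercept = pm.Normal("intercept", mu=0.0, sigma=10.0)
    slope = pm.Normal("slope", mu=0.0, sigma=10.0)
    noise = pm.HalfNormal("noise", sigma=1.0)
    pm.Normal("obs", mu=intercept + slope * x, sigma=noise, observed=y)
    idata = pm.sample(1000, tune=1000, return_inferencedata=True)
```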
A spectral approach to gradient estimation for implicit distributions
Recently there has been increasing interest in learning and inference with implicit
distributions (i.e., distributions without tractable densities). To this end, we develop a gradient …
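The idea is to estimate the score ∇_x log q(x) of a sample-only (implicit) distribution by expanding it in Nyström approximations of kernel eigenfunctions, with coefficients obtained from a Stein-type identity. The sketch below follows that general recipe with an RBF kernel; bandwidth, number of eigenfunctions, and truncation are placeholders rather than the paper's choices:

```python
import numpy as np

def spectral_score_estimator(samples, queries, sigma=1.0, n_eig=6):
    """Estimate grad log q at `queries` from `samples` (shape (M, d)) of an
    implicit q, via Nystrom eigenfunctions of an RBF Gram matrix."""
    M, d = samples.shape
    diff = samples[:, None, :] - samples[None, :, :]          # x_i - x_m
    K = np.exp(-(diff ** 2).sum(-1) / (2 * sigma ** 2))
    evals, evecs = np.linalg.eigh(K)
    evals, evecs = evals[::-1][:n_eig], evecs[:, ::-1][:, :n_eig]
    # gradients of the Nystrom eigenfunctions psi_j at the samples
    grad_K = -diff / sigma ** 2 * K[..., None]                # d/dx_i k(x_i, x_m)
    grad_psi = np.einsum("imd,mj->ijd", grad_K, evecs) * (np.sqrt(M) / evals)[None, :, None]
    beta = -grad_psi.mean(axis=0)                             # Stein coefficients, (n_eig, d)
    # evaluate the eigenfunctions at the query points and combine
    dq = ((queries[:, None, :] - samples[None, :, :]) ** 2).sum(-1)
    Kq = np.exp(-dq / (2 * sigma ** 2))
    psi_q = Kq @ evecs * (np.sqrt(M) / evals)[None, :]
    return psi_q @ beta                                       # (n_queries, d) score estimates
```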
K2-ABC: Approximate Bayesian computation with kernel embeddings
M Park, W Jitkrittum… - Artificial intelligence and …, 2016 - proceedings.mlr.press
Complicated generative models often result in a situation where computing the likelihood of
observed data is intractable, while simulating from the conditional density given a parameter …
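The method avoids hand-crafted summary statistics by comparing simulated and observed data directly through their kernel mean embeddings: each prior draw is weighted by exp(-MMD²/ε). A self-contained sketch (bandwidth and ε are placeholder values):

```python
import numpy as np

def mmd2(x, y, sigma=1.0):
    """Biased squared MMD between two 2-D sample arrays under an RBF kernel."""
    def k(a, b):
        return np.exp(-((a[:, None, :] - b[None, :, :]) ** 2).sum(-1) / (2 * sigma ** 2))
    return k(x, x).mean() + k(y, y).mean() - 2 * k(x, y).mean()

def k2_abc(y_obs, sample_prior, simulate, n_sims=1000, eps=0.1, rng=None):
    """Soft ABC: draw parameters from the prior, simulate data sets, and weight
    each draw by exp(-MMD^2 / eps) against the observed sample `y_obs`."""
    rng = rng or np.random.default_rng()
    thetas = np.array([sample_prior(rng) for _ in range(n_sims)])
    w = np.array([np.exp(-mmd2(simulate(t, rng), y_obs) / eps) for t in thetas])
    w = w / w.sum()
    return thetas, w   # e.g. posterior mean of a scalar theta: (w * thetas).sum()
```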
Gradient-free Hamiltonian Monte Carlo with efficient kernel exponential families
H Strathmann, D Sejdinovic… - Advances in …, 2015 - proceedings.neurips.cc
We propose Kernel Hamiltonian Monte Carlo (KMC), a gradient-free adaptive
MCMC algorithm based on Hamiltonian Monte Carlo (HMC). On target densities where …
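Structurally, KMC runs an HMC-like proposal in which the unavailable gradient is replaced by a surrogate score fitted to past chain samples (in the paper, via a kernel exponential family), while the accept/reject step still evaluates the exact unnormalized target. The sketch below mirrors the vanilla HMC step shown earlier, differing only in where the gradient comes from; the surrogate is left abstract rather than reproducing the paper's fit:

```python
import numpy as np

def gradient_free_hmc_step(x, log_target, surrogate_grad, step_size=0.1, n_leap=20, rng=None):
    """HMC-style step with a surrogate in place of grad log pi; the Metropolis
    correction still uses the exact (unnormalized) target density."""
    rng = rng or np.random.default_rng()
    p0 = rng.standard_normal(x.shape)
    q, p = x.copy(), p0.copy()
    p = p + 0.5 * step_size * surrogate_grad(q)
    for _ in range(n_leap - 1):
        q = q + step_size * p
        p = p + step_size * surrogate_grad(q)
    q = q + step_size * p
    p = p + 0.5 * step_size * surrogate_grad(q)
    h_old = -log_target(x) + 0.5 * p0 @ p0
    h_new = -log_target(q) + 0.5 * p @ p
    return q if np.log(rng.uniform()) < h_old - h_new else x
```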
RKHS-SHAP: Shapley values for kernel methods
SL Chau, R Hu, J Gonzalez… - Advances in neural …, 2022 - proceedings.neurips.cc
Feature attribution for kernel methods is often heuristic and not individualised for each
prediction. To address this, we turn to the concept of Shapley values (SV), a coalition game …
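Shapley values average a feature's marginal contribution over all coalitions of the remaining features; the paper's contribution is computing the required value functions analytically for kernel methods. The sketch below computes exact Shapley values for a small feature set, using mean imputation of absent features as a deliberately simple value function (not the paper's RKHS construction):

```python
import numpy as np
from itertools import combinations
from math import factorial

def shapley_values(predict, x, X_background):
    """Exact Shapley attribution of predict(x) over d features (exponential in d,
    so only for small d). Absent features are filled with background means."""
    d = x.shape[0]
    mu = X_background.mean(axis=0)

    def value(S):
        z = mu.copy()
        z[list(S)] = x[list(S)]
        return float(predict(z[None, :])[0])

    phi = np.zeros(d)
    for i in range(d):
        others = [j for j in range(d) if j != i]
        for r in range(d):
            for S in combinations(others, r):
                w = factorial(r) * factorial(d - r - 1) / factorial(d)
                phi[i] += w * (value(S + (i,)) - value(S))
    return phi
```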