When Gaussian process meets big data: A review of scalable GPs

H Liu, YS Ong, X Shen, J Cai - IEEE Transactions on Neural …, 2020 - ieeexplore.ieee.org
The vast quantity of information brought by big data as well as the evolving computer
hardware encourages success stories in the machine learning community. In the …

Advances in variational inference

C Zhang, J Bütepage, H Kjellström… - IEEE Transactions on …, 2018 - ieeexplore.ieee.org
Many modern unsupervised or semi-supervised machine learning algorithms rely on
Bayesian probabilistic models. These models are usually intractable and thus require …

Virtual adversarial training: a regularization method for supervised and semi-supervised learning

T Miyato, S Maeda, M Koyama… - IEEE Transactions on …, 2018 - ieeexplore.ieee.org
We propose a new regularization method based on virtual adversarial loss: a new measure
of local smoothness of the conditional label distribution given input. Virtual adversarial loss …

Constrained Bayesian optimization for automatic chemical design using variational autoencoders

RR Griffiths, JM Hernández-Lobato - Chemical Science, 2020 - pubs.rsc.org
Automatic Chemical Design is a framework for generating novel molecules with optimized
properties. The original scheme, featuring Bayesian optimization over the latent space of a …

Understanding probabilistic sparse Gaussian process approximations

M Bauer, M Van der Wilk… - Advances in Neural …, 2016 - proceedings.neurips.cc
Good sparse approximations are essential for practical inference in Gaussian Processes as
the computational cost of exact methods is prohibitive for large datasets. The Fully …

Deep Gaussian processes for regression using approximate expectation propagation

T Bui, D Hernández-Lobato… - International …, 2016 - proceedings.mlr.press
Deep Gaussian processes (DGPs) are multi-layer hierarchical generalisations of
Gaussian processes (GPs) and are formally equivalent to neural networks with multiple …

Functional regularisation for continual learning with Gaussian processes

MK Titsias, J Schwarz, AGG Matthews… - arXiv preprint arXiv …, 2019 - arxiv.org
We introduce a framework for Continual Learning (CL) based on Bayesian inference over
the function space rather than the parameters of a deep neural network. This method …

Adversarial examples, uncertainty, and transfer testing robustness in Gaussian process hybrid deep networks

J Bradshaw, AGG Matthews, Z Ghahramani - arXiv preprint arXiv …, 2017 - arxiv.org
Deep neural networks (DNNs) have excellent representative power and are state-of-the-art
classifiers on many tasks. However, they often do not capture their own uncertainties well …

Hilbert space methods for reduced-rank Gaussian process regression

A Solin, S Särkkä - Statistics and Computing, 2020 - Springer
This paper proposes a novel scheme for reduced-rank Gaussian process regression. The
method is based on an approximate series expansion of the covariance function in terms of …

Convolutional Gaussian processes

M Van der Wilk, CE Rasmussen… - Advances in Neural …, 2017 - proceedings.neurips.cc
We present a practical way of introducing convolutional structure into Gaussian processes,
making them more suited to high-dimensional inputs like images. The main contribution of …