When Gaussian process meets big data: A review of scalable GPs
The vast quantity of information brought by big data as well as the evolving computer
hardware encourages success stories in the machine learning community. In the …
Advances in variational inference
Many modern unsupervised or semi-supervised machine learning algorithms rely on
Bayesian probabilistic models. These models are usually intractable and thus require …
Virtual adversarial training: a regularization method for supervised and semi-supervised learning
We propose a new regularization method based on virtual adversarial loss: a new measure
of local smoothness of the conditional label distribution given input. Virtual adversarial loss …
Constrained Bayesian optimization for automatic chemical design using variational autoencoders
Automatic Chemical Design is a framework for generating novel molecules with optimized
properties. The original scheme, featuring Bayesian optimization over the latent space of a …
Understanding probabilistic sparse Gaussian process approximations
Good sparse approximations are essential for practical inference in Gaussian Processes as
the computational cost of exact methods is prohibitive for large datasets. The Fully …
Deep Gaussian processes for regression using approximate expectation propagation
Deep Gaussian processes (DGPs) are multi-layer hierarchical generalisations of
Gaussian processes (GPs) and are formally equivalent to neural networks with multiple …
Functional regularisation for continual learning with Gaussian processes
We introduce a framework for Continual Learning (CL) based on Bayesian inference over
the function space rather than the parameters of a deep neural network. This method …
Adversarial examples, uncertainty, and transfer testing robustness in Gaussian process hybrid deep networks
Deep neural networks (DNNs) have excellent representative power and are state of the art
classifiers on many tasks. However, they often do not capture their own uncertainties well …
Hilbert space methods for reduced-rank Gaussian process regression
This paper proposes a novel scheme for reduced-rank Gaussian process regression. The
method is based on an approximate series expansion of the covariance function in terms of …
Convolutional Gaussian processes
We present a practical way of introducing convolutional structure into Gaussian processes,
making them more suited to high-dimensional inputs like images. The main contribution of …