When Gaussian process meets big data: A review of scalable GPs

H Liu, YS Ong, X Shen, J Cai - IEEE transactions on neural …, 2020 - ieeexplore.ieee.org
The vast quantity of information brought by big data as well as the evolving computer
hardware encourages success stories in the machine learning community. In the …

Advances in variational inference

C Zhang, J Bütepage, H Kjellström… - IEEE transactions on …, 2018 - ieeexplore.ieee.org
Many modern unsupervised or semi-supervised machine learning algorithms rely on
Bayesian probabilistic models. These models are usually intractable and thus require …
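
Since the snippet stops right where approximation methods would be introduced, a minimal illustration of the central object of variational inference, the evidence lower bound (ELBO), may help. The toy model, variational family, and crude grid search below are purely illustrative choices, not taken from the survey.

import numpy as np

# Toy model (illustrative): prior z ~ N(0, 1), likelihood x_i | z ~ N(z, 1).
# Variational family q(z) = N(mu, sigma^2); the ELBO is estimated by Monte Carlo
# and mu is chosen by grid search (a practical method would use stochastic
# gradients of this same objective).
rng = np.random.default_rng(0)
x = rng.normal(1.5, 1.0, size=50)                      # observed data

def elbo(mu, sigma, n_samples=2000):
    z = mu + sigma * rng.normal(size=n_samples)        # reparameterized samples from q
    log_prior = -0.5 * z**2 - 0.5 * np.log(2 * np.pi)
    log_lik = np.sum(-0.5 * (x[None, :] - z[:, None])**2 - 0.5 * np.log(2 * np.pi), axis=1)
    log_q = -0.5 * ((z - mu) / sigma)**2 - np.log(sigma) - 0.5 * np.log(2 * np.pi)
    return np.mean(log_prior + log_lik - log_q)        # E_q[log p(x, z) - log q(z)]

sigma_q = 1.0 / np.sqrt(len(x) + 1)                    # exact posterior width for this conjugate toy
best_elbo, best_mu = max((elbo(m, sigma_q), m) for m in np.linspace(0.0, 3.0, 61))
print(best_mu, best_elbo)                              # best_mu approaches sum(x) / (len(x) + 1)

The same bound reappears, with richer models and gradient-based optimization, throughout the families of methods such a survey covers.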

Virtual adversarial training: a regularization method for supervised and semi-supervised learning

T Miyato, S Maeda, M Koyama… - IEEE transactions on …, 2018 - ieeexplore.ieee.org
We propose a new regularization method based on virtual adversarial loss: a new measure
of local smoothness of the conditional label distribution given input. Virtual adversarial loss …
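
As a rough companion to that description, here is a hedged PyTorch sketch of a virtual-adversarial-style smoothness penalty. The function name, the choices of xi and eps, the single power-iteration step, and the assumption that inputs are flat (batch, features) tensors are illustrative rather than the paper's exact configuration.

import torch
import torch.nn.functional as F

def virtual_adversarial_loss(model, x, xi=1e-6, eps=2.0, n_power=1):
    # Current predictive distribution, treated as a fixed target.
    with torch.no_grad():
        p = F.softmax(model(x), dim=1)
    # Random initial direction, normalized per example (x assumed (batch, features)).
    d = torch.randn_like(x)
    d = d / (d.norm(dim=1, keepdim=True) + 1e-12)
    for _ in range(n_power):
        # Power-iteration step: follow the gradient of the divergence at a tiny perturbation.
        d.requires_grad_(True)
        log_q = F.log_softmax(model(x + xi * d), dim=1)
        dist = F.kl_div(log_q, p, reduction="batchmean")
        grad = torch.autograd.grad(dist, d)[0]
        d = (grad / (grad.norm(dim=1, keepdim=True) + 1e-12)).detach()
    # Penalize divergence along the (approximately) most sensitive direction; no labels used.
    log_q_adv = F.log_softmax(model(x + eps * d), dim=1)
    return F.kl_div(log_q_adv, p, reduction="batchmean")

In a semi-supervised setting this term would typically be added, with some weight, to the usual cross-entropy loss computed on the labeled subset only.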

[BOOK][B] Mathematics for machine learning

MP Deisenroth, AA Faisal, CS Ong - 2020 - books.google.com
The fundamental mathematical tools needed to understand machine learning include linear
algebra, analytic geometry, matrix decompositions, vector calculus, optimization, probability …

FedLoc: Federated learning framework for data-driven cooperative localization and location data processing

F Yin, Z Lin, Q Kong, Y Xu, D Li… - IEEE Open Journal …, 2020 - ieeexplore.ieee.org
In this overview paper, data-driven, learning-model-based cooperative localization and
location data processing are considered, in line with the emerging machine learning and big …

Scalable variational Gaussian process classification

J Hensman, A Matthews… - Artificial intelligence and …, 2015 - proceedings.mlr.press
Gaussian process classification is a popular method with a number of appealing properties.
We show how to scale the model within a variational inducing point framework, out …
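
For readers wanting to see what the variational inducing-point construction looks like in code, below is a small, hedged sketch using GPflow's SVGP model. GPflow is assumed here only as one convenient implementation of this family of methods; the kernel, number of inducing points, synthetic data, and optimizer are illustrative choices.

import numpy as np
import gpflow

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(500, 1))
Y = (np.sin(2 * X) + 0.3 * rng.normal(size=(500, 1)) > 0).astype(float)   # binary labels

Z = X[:20].copy()                       # 20 inducing inputs, initialized from the data
model = gpflow.models.SVGP(
    kernel=gpflow.kernels.SquaredExponential(),
    likelihood=gpflow.likelihoods.Bernoulli(),
    inducing_variable=Z,
    num_data=len(X),
)

# Maximize the variational bound; for genuinely large data one would instead
# draw minibatches and run a stochastic optimizer on the same objective.
gpflow.optimizers.Scipy().minimize(
    model.training_loss_closure((X, Y)), model.trainable_variables
)

Xtest = np.linspace(-3, 3, 100)[:, None]
prob, _ = model.predict_y(Xtest)        # predictive class probabilities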

Stochastic variational deep kernel learning

AG Wilson, Z Hu… - Advances in neural …, 2016 - proceedings.neurips.cc
Deep kernel learning combines the non-parametric flexibility of kernel methods with the
inductive biases of deep learning architectures. We propose a novel deep kernel learning …
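
To make the "kernel on learned features" idea concrete, here is a hedged NumPy sketch in which a tiny fixed-weight network maps inputs to features and an RBF kernel acts on those features. In the paper the network and GP hyperparameters are trained jointly and stochastically; that joint training is omitted here, and all sizes and weights are illustrative.

import numpy as np

rng = np.random.default_rng(0)
X = np.linspace(-3, 3, 100)[:, None]
y = np.sin(2 * X[:, 0]) + 0.1 * rng.normal(size=100)

# Illustrative feature extractor: a small MLP with *fixed* random weights.
W1, b1 = rng.normal(size=(1, 32)), rng.normal(size=32)
W2, b2 = rng.normal(size=(32, 2)), rng.normal(size=2)
def phi(x):
    return np.tanh(np.tanh(x @ W1 + b1) @ W2 + b2)

def rbf(A, B, lengthscale=1.0):
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / lengthscale ** 2)

# Standard GP regression, but with the kernel evaluated on phi(x) rather than x.
noise = 0.01
K = rbf(phi(X), phi(X)) + noise * np.eye(len(X))
Xs = np.linspace(-3, 3, 200)[:, None]
Ks = rbf(phi(Xs), phi(X))
mean = Ks @ np.linalg.solve(K, y)
var = 1.0 - np.einsum("ij,ji->i", Ks, np.linalg.solve(K, Ks.T))

The "stochastic variational" part of the paper replaces the exact solve above with an inducing-point bound, so that both the network weights and the GP hyperparameters can be learned from minibatches.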

Distributed Gaussian processes

M Deisenroth, JW Ng - International conference on machine …, 2015 - proceedings.mlr.press
To scale Gaussian processes (GPs) to large data sets we introduce the robust
Bayesian Committee Machine (rBCM), a practical and scalable product-of-experts model for …
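
The combination rule behind such product-of-experts models can be sketched in a few lines: fit independent exact GPs on disjoint subsets and merge their predictions with a precision-weighted rule in the spirit of the rBCM. The kernel, noise level, number of experts, and the differential-entropy weights below are illustrative assumptions, not a faithful reimplementation of the paper.

import numpy as np

rng = np.random.default_rng(1)
X = np.sort(rng.uniform(-4, 4, 400))[:, None]
y = np.sin(X[:, 0]) + 0.2 * rng.normal(size=400)
noise, lengthscale = 0.04, 1.0
prior_var = 1.0 + noise                          # k(x, x) + noise for this kernel

def rbf(A, B):
    return np.exp(-0.5 * (A[:, None, 0] - B[None, :, 0]) ** 2 / lengthscale ** 2)

def expert(Xk, yk, Xs):
    # Exact GP trained only on this expert's subset of the data.
    K = rbf(Xk, Xk) + noise * np.eye(len(Xk))
    Ks = rbf(Xs, Xk)
    mu = Ks @ np.linalg.solve(K, yk)
    var = prior_var - np.einsum("ij,ji->i", Ks, np.linalg.solve(K, Ks.T))
    return mu, var

Xs = np.linspace(-4, 4, 100)[:, None]
prec = np.zeros(100)
weighted_mu = np.zeros(100)
beta_sum = np.zeros(100)
for Xk, yk in zip(np.array_split(X, 8), np.array_split(y, 8)):    # 8 independent experts
    mu_k, var_k = expert(Xk, yk, Xs)
    beta_k = 0.5 * (np.log(prior_var) - np.log(var_k))            # differential-entropy weight
    prec += beta_k / var_k
    weighted_mu += beta_k * mu_k / var_k
    beta_sum += beta_k
prec += (1.0 - beta_sum) / prior_var             # correction pulling the combined model toward the prior
var_bcm = 1.0 / prec
mu_bcm = var_bcm * weighted_mu

Each expert can be trained on a different machine, which is what makes the construction attractive for large data sets; only the per-expert predictive means and variances need to be communicated.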

Gaussian process prior variational autoencoders

FP Casale, A Dalca, L Saglietti… - Advances in neural …, 2018 - proceedings.neurips.cc
Variational autoencoders (VAE) are a powerful and widely-used class of models to learn
complex data distributions in an unsupervised fashion. One important limitation of VAEs is …

Sparse Gaussian process regression for multi-step ahead forecasting of wind gusts combining numerical weather predictions and on-site measurements

H Wang, YM Zhang, JX Mao - Journal of Wind Engineering and Industrial …, 2022 - Elsevier
Accurate forecasts of wind gusts are crucially important for wind power generation, severe
weather warnings, and the regulation of vehicle speed. To improve the short-term and long …