User-friendly introduction to PAC-Bayes bounds

P Alquier - Foundations and Trends® in Machine Learning, 2024 - nowpublishers.com
Aggregated predictors are obtained by making a set of basic predictors vote according to
some weights, that is, to some probability distribution. Randomized predictors are obtained …
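The aggregation scheme the snippet describes, basic predictors voting according to a probability distribution over them, can be illustrated with a minimal sketch. The predictor functions and weights below are hypothetical toy examples, not taken from the survey:

```python
import numpy as np

def aggregate_predict(predictors, weights, x):
    """Aggregated prediction: a convex combination of basic predictors,
    with nonnegative weights forming a probability distribution."""
    weights = np.asarray(weights, dtype=float)
    assert np.all(weights >= 0) and np.isclose(weights.sum(), 1.0)
    preds = np.array([f(x) for f in predictors])
    return float(weights @ preds)

# Three toy basic predictors (hypothetical illustrations).
predictors = [lambda x: 0.0, lambda x: 1.0, lambda x: x]
weights = [0.25, 0.25, 0.5]
print(aggregate_predict(predictors, weights, 2.0))  # 0.25*0 + 0.25*1 + 0.5*2 = 1.25
```

A randomized predictor, the other object the snippet mentions, would instead draw a single predictor at random from the same distribution before predicting.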

Wireless network intelligence at the edge

J Park, S Samarakoon, M Bennis… - Proceedings of the …, 2019 - ieeexplore.ieee.org
Fueled by the availability of more data and computing power, recent breakthroughs in cloud-
based machine learning (ML) have transformed every aspect of our lives from face …

A vector-contraction inequality for Rademacher complexities


A Maurer - … Learning Theory: 27th International Conference, ALT …, 2016 - Springer
The contraction inequality for Rademacher averages is extended to Lipschitz functions with
vector-valued domains, and it is also shown that in the bounding expression the …

Gaussian processes in machine learning

CE Rasmussen - Summer school on machine learning, 2003 - Springer
We give a basic introduction to Gaussian Process regression models. We focus on
understanding the role of the stochastic process and how it is used to define a distribution …
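The distribution over functions that the tutorial builds on can be sketched in a few lines: given noisy observations, the GP posterior mean and variance at test points follow from standard kernel algebra. This is a minimal sketch assuming a zero-mean prior and a squared-exponential kernel; the function names and hyperparameters are illustrative, not from the tutorial:

```python
import numpy as np

def rbf(a, b, ls=1.0):
    """Squared-exponential kernel: defines the GP's distribution over functions."""
    d = a[:, None] - b[None, :]
    return np.exp(-0.5 * (d / ls) ** 2)

def gp_posterior(X, y, Xs, noise=1e-2):
    """Posterior mean and pointwise variance of a zero-mean GP at test points Xs."""
    K = rbf(X, X) + noise * np.eye(len(X))   # prior covariance + observation noise
    Ks = rbf(X, Xs)                          # train/test cross-covariance
    Kss = rbf(Xs, Xs)                        # test covariance
    sol = np.linalg.solve(K, Ks)
    mean = sol.T @ y
    cov = Kss - Ks.T @ sol
    return mean, np.diag(cov)

X = np.array([0.0, 1.0, 2.0])
y = np.sin(X)
mean, var = gp_posterior(X, y, np.array([1.0]))
# At an observed input, the posterior mean stays close to the observation.
```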

The SSL interplay: Augmentations, inductive bias, and generalization

V Cabannes, B Kiani, R Balestriero… - International …, 2023 - proceedings.mlr.press
Self-supervised learning (SSL) has emerged as a powerful framework to learn
representations from raw data without supervision. Yet in practice, engineers face issues …

[PDF] A gentle tutorial of the EM algorithm and its application to parameter estimation for Gaussian mixture and hidden Markov models

JA Bilmes - International computer science institute, 1998 - leap.ee.iisc.ac.in
We describe the maximum-likelihood parameter estimation problem and how the
Expectation-Maximization (EM) algorithm can be used for its solution. We first describe the …
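The alternation the tutorial covers, an E-step computing responsibilities and an M-step re-estimating parameters, can be sketched for the Gaussian-mixture case. This toy version assumes a two-component 1-D mixture with unit variances and equal mixing weights, so only the means are updated; it is an illustration, not the tutorial's derivation:

```python
import numpy as np

def em_gmm_1d(x, mu, steps=50):
    """EM for a two-component 1-D Gaussian mixture (unit variances,
    equal mixing weights): a minimal sketch of the E and M steps."""
    mu = np.array(mu, dtype=float)
    for _ in range(steps):
        # E-step: responsibility of each component for each point.
        logp = -0.5 * (x[:, None] - mu[None, :]) ** 2
        p = np.exp(logp - logp.max(axis=1, keepdims=True))
        r = p / p.sum(axis=1, keepdims=True)
        # M-step: responsibility-weighted means.
        mu = (r * x[:, None]).sum(axis=0) / r.sum(axis=0)
    return np.sort(mu)

rng = np.random.default_rng(0)
x = np.concatenate([rng.normal(-3, 1, 200), rng.normal(3, 1, 200)])
mu = em_gmm_1d(x, [-1.0, 1.0])  # converges near the true means -3 and 3
```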

[BOOK][B] Learning theory from first principles

F Bach - 2024 - books.google.com
A comprehensive and cutting-edge introduction to the foundations and modern applications
of learning theory. Research has exploded in the field of machine learning resulting in …

Sparse multinomial logistic regression: Fast algorithms and generalization bounds

B Krishnapuram, L Carin… - IEEE transactions on …, 2005 - ieeexplore.ieee.org
Recently developed methods for learning sparse classifiers are among the state-of-the-art in
supervised learning. These methods learn classifiers that incorporate weighted sums of …

On the generalization ability of on-line learning algorithms

N Cesa-Bianchi, A Conconi… - IEEE Transactions on …, 2004 - ieeexplore.ieee.org
In this paper, it is shown how to extract a hypothesis with small risk from the ensemble of
hypotheses generated by an arbitrary on-line learning algorithm run on an independent and …
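The extraction the snippet refers to, turning the sequence of hypotheses an online learner produces into a single hypothesis with small risk, is often done by averaging the iterates. Below is a sketch using the perceptron as the online algorithm on hypothetical synthetic data; it illustrates the online-to-batch idea, not the paper's specific procedure:

```python
import numpy as np

def online_to_batch_perceptron(X, y):
    """Run the perceptron online and return the average of the
    hypotheses it generates along the way (online-to-batch conversion)."""
    w = np.zeros(X.shape[1])
    iterates = []
    for xi, yi in zip(X, y):
        if yi * (w @ xi) <= 0:   # online mistake -> perceptron update
            w = w + yi * xi
        iterates.append(w.copy())
    return np.mean(iterates, axis=0)

# Hypothetical two-class data, roughly separable around +/-(2, 2).
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(size=(200, 2)) + 2.0,
               rng.normal(size=(200, 2)) - 2.0])
y = np.concatenate([np.ones(200), -np.ones(200)])
idx = rng.permutation(len(y))
X, y = X[idx], y[idx]

w_avg = online_to_batch_perceptron(X, y)
acc = np.mean(np.sign(X @ w_avg) == y)  # averaged hypothesis classifies well
```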

PAC-Bayesian theory meets Bayesian inference

P Germain, F Bach, A Lacoste… - Advances in Neural …, 2016 - proceedings.neurips.cc
We exhibit a strong link between frequentist PAC-Bayesian bounds and the Bayesian
marginal likelihood. That is, for the negative log-likelihood loss function, we show that the …