User-friendly introduction to PAC-Bayes bounds
P Alquier - Foundations and Trends® in Machine Learning, 2024 - nowpublishers.com
Aggregated predictors are obtained by making a set of basic predictors vote according to
some weights, that is, to some probability distribution. Randomized predictors are obtained …
Generalization bounds: Perspectives from information theory and PAC-Bayes
A fundamental question in theoretical machine learning is generalization. Over the past
decades, the PAC-Bayesian approach has been established as a flexible framework to …
A primer on PAC-Bayesian learning
B Guedj - arXiv preprint arXiv:1901.05353, 2019 - arxiv.org
Generalised Bayesian learning algorithms are increasingly popular in machine learning,
due to their PAC generalisation properties and flexibility. The present paper aims at …
A unified recipe for deriving (time-uniform) PAC-Bayes bounds
We present a unified framework for deriving PAC-Bayesian generalization bounds. Unlike
most previous literature on this topic, our bounds are anytime-valid (i.e., time-uniform) …
On the properties of variational approximations of Gibbs posteriors
The PAC-Bayesian approach is a powerful set of techniques to derive nonasymptotic risk
bounds for random estimators. The corresponding optimal distribution of estimators, usually …
Online PAC-Bayes learning
Most PAC-Bayesian bounds hold in the batch learning setting where data is collected at
once, prior to inference or prediction. This somewhat departs from many contemporary …
PAC-Bayes generalisation bounds for heavy-tailed losses through supermartingales
While PAC-Bayes is now an established learning framework for light-tailed losses (e.g., subgaussian or subexponential), its extension to the case of heavy-tailed losses …
Bayesian fractional posteriors
The Annals of Statistics, 2019, Vol. 47, No. 1, 39–66. https://doi.org/10.1214/18-AOS1712 © Institute of Mathematical Statistics, 2019 …
On PAC-Bayesian reconstruction guarantees for VAEs
Despite its wide use and empirical successes, the theoretical understanding and study of the
behaviour and performance of the variational autoencoder (VAE) have only emerged in the …
Simpler PAC-Bayesian bounds for hostile data
PAC-Bayesian learning bounds are of the utmost interest to the learning community. Their
role is to connect the generalization ability of an aggregation distribution ρ to its empirical …