User-friendly introduction to PAC-Bayes bounds

P Alquier - Foundations and Trends® in Machine Learning, 2024 - nowpublishers.com
Aggregated predictors are obtained by making a set of basic predictors vote according to
some weights, that is, to some probability distribution. Randomized predictors are obtained …
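
To make the distinction concrete (a minimal sketch in our own notation, not necessarily the survey's): given basic predictors f_theta indexed by theta in a set Theta and a probability distribution rho on Theta, the two constructions are

```latex
% Aggregated predictor: average the basic predictors' votes under rho
\hat{f}_{\rho}(x) \;=\; \int_{\Theta} f_{\theta}(x)\, \rho(\mathrm{d}\theta)

% Randomized predictor: draw a single theta from rho and predict with it
\tilde{\theta} \sim \rho, \qquad \hat{f}(x) \;=\; f_{\tilde{\theta}}(x)
```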

Generalization bounds: Perspectives from information theory and PAC-Bayes

F Hellström, G Durisi, B Guedj… - … and Trends® in …, 2025 - nowpublishers.com
A fundamental question in theoretical machine learning is generalization. Over the past
decades, the PAC-Bayesian approach has been established as a flexible framework to …
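
As a reminder of the kind of statement surveyed (this is one classical McAllester/Maurer-type form for a loss bounded in [0,1], with n i.i.d. samples, prior pi, posterior rho and confidence level 1 - delta; many variants exist):

```latex
\mathbb{P}\left( \forall \rho:\;
  \mathbb{E}_{\theta \sim \rho}\, R(\theta)
  \;\le\;
  \mathbb{E}_{\theta \sim \rho}\, \hat{R}_n(\theta)
  \;+\; \sqrt{\frac{\mathrm{KL}(\rho \,\|\, \pi) + \ln\frac{2\sqrt{n}}{\delta}}{2n}}
\right) \;\ge\; 1 - \delta ,
```

where R denotes the population risk and \hat{R}_n the empirical risk on the sample.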

MMD-FUSE: Learning and combining kernels for two-sample testing without data splitting

F Biggs, A Schrab, A Gretton - Advances in Neural …, 2023 - proceedings.neurips.cc
We propose novel statistics which maximise the power of a two-sample test based on the
Maximum Mean Discrepancy (MMD), by adapting over the set of kernels used in defining it …
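
For reference, a minimal sketch of the quantity being adapted: the squared MMD with a Gaussian kernel, estimated here by a simple V-statistic. The paper's actual FUSE statistic, its kernel selection and its permutation test are more involved, and the function names below are ours.

```python
import numpy as np

def gaussian_kernel(a, b, bandwidth):
    """Gaussian (RBF) kernel matrix between rows of a and rows of b."""
    sq_dists = (np.sum(a**2, axis=1)[:, None]
                + np.sum(b**2, axis=1)[None, :]
                - 2 * a @ b.T)
    return np.exp(-sq_dists / (2 * bandwidth**2))

def mmd2_biased(x, y, bandwidth=1.0):
    """Biased (V-statistic) estimate of the squared Maximum Mean Discrepancy."""
    k_xx = gaussian_kernel(x, x, bandwidth)
    k_yy = gaussian_kernel(y, y, bandwidth)
    k_xy = gaussian_kernel(x, y, bandwidth)
    return k_xx.mean() + k_yy.mean() - 2 * k_xy.mean()

# Illustration only: two samples from slightly shifted Gaussians
rng = np.random.default_rng(0)
x = rng.normal(0.0, 1.0, size=(200, 2))
y = rng.normal(0.5, 1.0, size=(200, 2))
print(mmd2_biased(x, y))
```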

Statistical guarantees for variational autoencoders using PAC-Bayesian theory

SD Mbacke, F Clerc, P Germain - Advances in Neural …, 2023 - proceedings.neurips.cc
Since their inception, Variational Autoencoders (VAEs) have become central in
machine learning. Despite their widespread use, numerous questions regarding their …

PAC-Bayes generalisation bounds for heavy-tailed losses through supermartingales

M Haddouche, B Guedj - arXiv preprint arXiv:2210.00928, 2022 - arxiv.org
While PAC-Bayes is now an established learning framework for light-tailed losses (e.g.,
subgaussian or subexponential), its extension to the case of heavy-tailed losses …
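
For context (our notation): light-tailed assumptions such as sub-Gaussianity control the moment generating function of the loss, whereas the heavy-tailed regime targeted here only assumes a finite moment of some low order p.

```latex
% Sub-Gaussian loss with variance proxy sigma^2: the MGF is controlled for all lambda
\mathbb{E}\!\left[ e^{\lambda (\ell - \mathbb{E}\ell)} \right] \;\le\; e^{\lambda^2 \sigma^2 / 2},
\qquad \forall \lambda \in \mathbb{R}.

% Heavy-tailed regime: only a bounded moment of some order p (e.g. p = 2) is assumed
\mathbb{E}\!\left[ |\ell - \mathbb{E}\ell|^{p} \right] \;\le\; M \;<\; \infty .
```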

Learning via Wasserstein-based high probability generalisation bounds

P Viallard, M Haddouche… - Advances in Neural …, 2023 - proceedings.neurips.cc
Minimising upper bounds on the population risk or the generalisation gap has been widely
used in structural risk minimisation (SRM); this is in particular at the core of PAC-Bayesian …
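
Schematically (our notation), the bound-minimisation principle referred to reads: instead of minimising the empirical risk alone, pick the posterior that minimises a high-probability upper bound on the population risk,

```latex
\hat{\rho} \;\in\; \operatorname*{arg\,min}_{\rho}\,
  \Big\{ \mathbb{E}_{\theta \sim \rho}\big[\hat{R}_n(\theta)\big]
  \;+\; \mathrm{pen}_n(\rho, \pi, \delta) \Big\},
```

where the complexity term pen_n comes from the bound: a KL divergence in standard PAC-Bayes, and a Wasserstein distance between rho and the reference measure pi in this work.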

PAC-Bayesian generalization bounds for adversarial generative models

SD Mbacke, F Clerc, P Germain - … Conference on Machine …, 2023 - proceedings.mlr.press
We extend PAC-Bayesian theory to generative models and develop generalization bounds
for models based on the Wasserstein distance and the total variation distance. Our first result …
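
The two distances involved are standard; for reference (our notation), the Wasserstein-1 distance in its Kantorovich-Rubinstein dual form and the total variation distance are

```latex
W_1(P, Q) \;=\; \sup_{\mathrm{Lip}(f) \le 1}
  \Big|\, \mathbb{E}_{X \sim P}[f(X)] - \mathbb{E}_{Y \sim Q}[f(Y)] \,\Big|,
\qquad
\mathrm{TV}(P, Q) \;=\; \sup_{A} \big| P(A) - Q(A) \big| .
```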

Improved algorithms for stochastic linear bandits using tail bounds for martingale mixtures

H Flynn, D Reeb, M Kandemir… - Advances in Neural …, 2023 - proceedings.neurips.cc
We present improved algorithms with worst-case regret guarantees for the stochastic linear
bandit problem. The widely used "optimism in the face of uncertainty" principle reduces a …
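
To make the "optimism in the face of uncertainty" principle concrete, here is a generic LinUCB-style optimistic action rule built from a ridge-regression estimate and an ellipsoidal confidence set; the fixed width beta below is a placeholder, whereas the paper derives tighter confidence sequences from tail bounds for martingale mixtures.

```python
import numpy as np

def optimistic_action(actions, X, y, reg=1.0, beta=1.0):
    """Pick the action maximising the UCB <a, theta_hat> + beta * ||a||_{V^{-1}}.

    actions : (K, d) array of candidate action feature vectors
    X, y    : past action features (t, d) and observed rewards (t,)
    reg     : ridge regularisation strength
    beta    : confidence-set radius (placeholder; the paper derives tighter values)
    """
    d = actions.shape[1]
    V = reg * np.eye(d) + X.T @ X               # regularised design matrix
    theta_hat = np.linalg.solve(V, X.T @ y)     # ridge regression estimate
    V_inv = np.linalg.inv(V)
    widths = np.sqrt(np.einsum('kd,de,ke->k', actions, V_inv, actions))
    ucb = actions @ theta_hat + beta * widths   # optimistic value of each action
    return int(np.argmax(ucb))

# Illustration only, with random data
rng = np.random.default_rng(0)
actions = rng.normal(size=(10, 3))
X = rng.normal(size=(20, 3))
y = X @ np.array([1.0, -0.5, 0.2]) + 0.1 * rng.normal(size=20)
print(optimistic_action(actions, X, y))
```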

Tighter generalisation bounds via interpolation

P Viallard, M Haddouche, U Şimşekli… - arXiv preprint arXiv …, 2024 - arxiv.org
This paper contains a recipe for deriving new PAC-Bayes generalisation bounds based on
the $(f,\Gamma)$-divergence, and additionally presents PAC-Bayes generalisation bounds …
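
For reference (our notation): roughly speaking, the $(f,\Gamma)$-divergence interpolates between an f-divergence and an integral probability metric (IPM) over a function class Gamma; its two classical building blocks are

```latex
D_f(\rho \,\|\, \pi) \;=\; \int f\!\left( \frac{\mathrm{d}\rho}{\mathrm{d}\pi} \right) \mathrm{d}\pi
\quad (f \text{ convex},\ f(1) = 0),
\qquad
d_{\Gamma}(\rho, \pi) \;=\; \sup_{g \in \Gamma}
  \Big|\, \mathbb{E}_{\rho}[g] - \mathbb{E}_{\pi}[g] \,\Big| .
```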

A PAC-Bayesian link between generalisation and flat minima

M Haddouche, P Viallard, U Simsekli… - arXiv preprint arXiv …, 2024 - arxiv.org
Modern machine learning usually involves predictors in the overparametrised setting
(number of trained parameters greater than the dataset size), and their training yields not only …
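
As a reference point only (the paper's precise notion of flatness may differ), a common way to quantify the flatness of a trained parameter w is the worst-case sharpness of the empirical risk \hat{R}_n over a ball of radius rho:

```latex
S_{\rho}(w) \;=\; \max_{\|\epsilon\| \le \rho}\, \hat{R}_n(w + \epsilon) \;-\; \hat{R}_n(w),
```

with small values of S_rho(w) indicating a flat minimum.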