User-friendly introduction to PAC-Bayes bounds

P Alquier - Foundations and Trends® in Machine Learning, 2024 - nowpublishers.com
Aggregated predictors are obtained by making a set of basic predictors vote according to
some weights, that is, to some probability distribution. Randomized predictors are obtained …
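
For orientation (notation mine, not from the snippet): with a loss bounded in [0,1], empirical risk r_n, population risk R, a prior π fixed before seeing the data, and any posterior ρ, a standard PAC-Bayes bound (Maurer's form of McAllester's bound) holds with probability at least 1−δ, simultaneously for all ρ:
\[
\mathbb{E}_{\theta\sim\rho}[R(\theta)] \;\le\; \mathbb{E}_{\theta\sim\rho}[r_n(\theta)] \;+\; \sqrt{\frac{\mathrm{KL}(\rho\,\|\,\pi)+\ln\frac{2\sqrt{n}}{\delta}}{2n}}.
\]
The aggregated predictor is then \(\hat f(x)=\mathbb{E}_{\theta\sim\rho}[f_\theta(x)]\), while a randomized predictor draws a fresh \(\theta\sim\rho\) at prediction time.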

Generalization bounds: Perspectives from information theory and PAC-Bayes

F Hellström, G Durisi, B Guedj… - Foundations and Trends® in Machine Learning, 2025 - nowpublishers.com
A fundamental question in theoretical machine learning is generalization. Over the past
decades, the PAC-Bayesian approach has been established as a flexible framework to …
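
A representative result from the information-theoretic side (standard statement, given here for orientation rather than quoted from the monograph) is the Xu–Raginsky bound: if the loss \(\ell(w,Z)\) is \(\sigma\)-subgaussian for every \(w\), a learner that outputs \(W\) from \(n\) i.i.d. samples \(S\) satisfies
\[
\bigl|\mathbb{E}[R(W)-r_n(W)]\bigr| \;\le\; \sqrt{\frac{2\sigma^2\, I(W;S)}{n}},
\]
where \(I(W;S)\) is the mutual information between hypothesis and training set.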

A unified recipe for deriving (time-uniform) PAC-Bayes bounds

B Chugg, H Wang, A Ramdas - Journal of Machine Learning Research, 2023 - jmlr.org
We present a unified framework for deriving PAC-Bayesian generalization bounds. Unlike
most previous literature on this topic, our bounds are anytime-valid (i.e., time-uniform) …
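
The engine behind such anytime-valid guarantees is Ville's inequality (standard statement, not specific to this paper): if \((M_t)_{t\ge 0}\) is a nonnegative supermartingale with \(M_0=1\), then
\[
\Pr\Bigl(\exists\, t\ge 1:\; M_t \ge \tfrac{1}{\delta}\Bigr) \;\le\; \delta,
\]
so the resulting bound holds simultaneously at every sample size \(t\), rather than at one fixed \(n\) as in a Markov/Chernoff argument.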

Non-vacuous generalization bounds for large language models

S Lotfi, M Finzi, Y Kuang, TGJ Rudner… - arXiv preprint arXiv …, 2023 - arxiv.org
Modern language models can contain billions of parameters, raising the question of whether
they can generalize beyond the training data or simply parrot their training corpora. We …
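
The usual route to non-vacuous bounds at this scale is compression; as a hedged sketch (my notation, loss in [0,1]), take a prior \(\pi(h)=2^{-|c(h)|}\), where \(|c(h)|\) is the length in bits of a prefix-free code for \(h\). A union bound then gives, with probability at least \(1-\delta\), for all \(h\),
\[
R(h) \;\le\; r_n(h) + \sqrt{\frac{|c(h)|\ln 2 + \ln\frac{1}{\delta}}{2n}},
\]
so a model that compresses well can certify a meaningful guarantee despite billions of raw parameters.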

Mission impossible: A statistical perspective on jailbreaking LLMs

J Su, J Kempe, K Ullrich - Advances in Neural Information Processing Systems, 2025 - proceedings.neurips.cc
Large language models (LLMs) are trained on a deluge of text data with limited quality
control. As a result, LLMs can exhibit unintended or even harmful behaviours, such as …

PAC-Bayes generalisation bounds for heavy-tailed losses through supermartingales

M Haddouche, B Guedj - arXiv preprint arXiv:2210.00928, 2022 - arxiv.org
While PAC-Bayes is now an established learning framework for light-tailed losses (e.g.,
subgaussian or subexponential), its extension to the case of heavy-tailed losses …
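
For orientation (standard definitions, not from the snippet): a loss \(\ell\) is \(\sigma\)-subgaussian when
\[
\mathbb{E}\, e^{\lambda(\ell-\mathbb{E}\ell)} \;\le\; e^{\lambda^2\sigma^2/2} \quad\text{for all } \lambda\in\mathbb{R},
\]
whereas the heavy-tailed regime assumes only a few finite moments (say \(\mathbb{E}|\ell|^p<\infty\) for some \(p>1\)), so MGF-based Chernoff arguments break down and supermartingale constructions take their place.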

Generalization bounds for meta-learning via PAC-Bayes and uniform stability

A Farid, A Majumdar - Advances in Neural Information Processing Systems, 2021 - proceedings.neurips.cc
We are motivated by the problem of providing strong generalization guarantees in the
context of meta-learning. Existing generalization bounds are either challenging to evaluate …
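
For reference (standard definition, not from this snippet): an algorithm \(A\) is \(\beta\)-uniformly stable if changing a single training example moves the loss on any test point by at most \(\beta\),
\[
\sup_{S\simeq S'}\;\sup_{z}\;\bigl|\ell(A(S),z)-\ell(A(S'),z)\bigr| \;\le\; \beta,
\]
where \(S\simeq S'\) means the two datasets differ in exactly one point; the title's combination pairs such stability of the within-task learner with a PAC-Bayes argument at the meta level.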

A new family of generalization bounds using samplewise evaluated CMI

F Hellström, G Durisi - Advances in Neural Information Processing Systems, 2022 - proceedings.neurips.cc
We present a new family of information-theoretic generalization bounds, in which the
training loss and the population loss are compared through a jointly convex function. This …
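
The underlying setup is the Steinke–Zakynthinou supersample (standard statement; the samplewise/evaluated refinement is this paper's contribution): draw \(\tilde Z\in\mathcal Z^{n\times 2}\) and membership bits \(U\in\{0,1\}^n\) selecting one entry per row as training data. For a loss in [0,1], the baseline conditional-mutual-information bound reads
\[
\mathbb{E}\bigl[R(W)-r_n(W)\bigr] \;\le\; \sqrt{\frac{2\, I(W;U\mid \tilde Z)}{n}};
\]
evaluating the information at the losses the hypothesis attains on the supersample, rather than at \(W\) itself, can only shrink the information term by data processing.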

Learning via Wasserstein-based high probability generalisation bounds

P Viallard, M Haddouche… - Advances in Neural Information Processing Systems, 2023 - proceedings.neurips.cc
Minimising upper bounds on the population risk or the generalisation gap has been widely
used in structural risk minimisation (SRM); this is in particular at the core of PAC-Bayesian …
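
The basic mechanism (Kantorovich–Rubinstein duality, sketched in my notation): if \(\theta\mapsto\ell(\theta,z)\) is \(L\)-Lipschitz for every \(z\), then for any distributions \(\rho,\pi\) on hypotheses
\[
\bigl|\mathbb{E}_{\theta\sim\rho} R(\theta)-\mathbb{E}_{\theta\sim\pi} R(\theta)\bigr| \;\le\; L\, W_1(\rho,\pi),
\qquad
W_1(\rho,\pi)=\inf_{\gamma\in\Gamma(\rho,\pi)}\mathbb{E}_{(\theta,\theta')\sim\gamma}\|\theta-\theta'\|,
\]
which is what lets geometry-aware Wasserstein terms stand in for the \(\mathrm{KL}(\rho\,\|\,\pi)\) of classical PAC-Bayes.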

Generalization analysis of machine learning algorithms via the worst-case data-generating probability measure

X Zou, SM Perlaza, I Esnaola, E Altman - Proceedings of the AAAI …, 2024 - ojs.aaai.org
In this paper, the worst-case probability measure over the data is introduced as a tool for
characterizing the generalization capabilities of machine learning algorithms. More …
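
One common formalization behind worst-case measures of this kind (the Gibbs variational principle, stated generically; the paper's exact construction may differ): for a reference measure \(P_0\) and \(\beta>0\),
\[
\sup_{P}\Bigl\{\mathbb{E}_{Z\sim P}\,\ell(w,Z)-\tfrac{1}{\beta}\,\mathrm{KL}(P\,\|\,P_0)\Bigr\}
\;=\;\tfrac{1}{\beta}\ln\mathbb{E}_{Z\sim P_0}\, e^{\beta\,\ell(w,Z)},
\]
attained by the Gibbs measure \(\mathrm{d}P^{\star}(z)\propto e^{\beta\,\ell(w,z)}\,\mathrm{d}P_0(z)\): the worst case tilts \(P_0\) exponentially toward high-loss regions.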