User-friendly introduction to PAC-Bayes bounds

P Alquier - Foundations and Trends® in Machine Learning, 2024 - nowpublishers.com
Aggregated predictors are obtained by making a set of basic predictors vote according to
some weights, that is, to some probability distribution. Randomized predictors are obtained …
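The distinction this abstract draws between aggregated and randomized predictors can be sketched in a few lines of Python; the basic predictors and the weight vector below are hypothetical stand-ins, not anything from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Three basic binary predictors on a scalar input (hypothetical stand-ins).
predictors = [lambda x: 1, lambda x: -1, lambda x: 1 if x > 0 else -1]
rho = np.array([0.5, 0.2, 0.3])  # the weights: a probability distribution over predictors

def aggregated(x):
    # Aggregated predictor: the basic predictors vote according to rho;
    # the output is the sign of the rho-weighted vote.
    votes = np.array([h(x) for h in predictors], dtype=float)
    return int(np.sign(rho @ votes))

def randomized(x):
    # Randomized (Gibbs) predictor: draw a single basic predictor from rho,
    # then predict with it.
    h = predictors[rng.choice(len(predictors), p=rho)]
    return h(x)
```

The aggregated predictor is deterministic given `rho`; the randomized one returns a fresh draw on each call.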

Generalization bounds: Perspectives from information theory and PAC-Bayes

F Hellström, G Durisi, B Guedj… - … and Trends® in …, 2025 - nowpublishers.com
A fundamental question in theoretical machine learning is generalization. Over the past
decades, the PAC-Bayesian approach has been established as a flexible framework to …

A primer on PAC-Bayesian learning

B Guedj - arXiv preprint arXiv:1901.05353, 2019 - arxiv.org
B Guedj - arXiv preprint arXiv:1901.05353, 2019 - arxiv.org
Generalised Bayesian learning algorithms are increasingly popular in machine learning,
due to their PAC generalisation properties and flexibility. The present paper aims at …

A unified recipe for deriving (time-uniform) PAC-Bayes bounds

B Chugg, H Wang, A Ramdas - Journal of Machine Learning Research, 2023 - jmlr.org
We present a unified framework for deriving PAC-Bayesian generalization bounds. Unlike
most previous literature on this topic, our bounds are anytime-valid (i.e., time-uniform) …

Tighter risk certificates for neural networks

M Pérez-Ortiz, O Rivasplata, J Shawe-Taylor… - Journal of Machine …, 2021 - jmlr.org
This paper presents an empirical study regarding training probabilistic neural networks
using training objectives derived from PAC-Bayes bounds. In the context of probabilistic …
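To give a rough sense of the kind of risk certificate such bounds produce, a McAllester-style PAC-Bayes bound can be evaluated numerically; the numbers below are made up for illustration, and the exact bounds optimized in the paper differ:

```python
import math

def mcallester_bound(emp_risk, kl, n, delta):
    """McAllester-style PAC-Bayes bound: with probability >= 1 - delta,
    risk <= emp_risk + sqrt((KL(rho||pi) + ln(2*sqrt(n)/delta)) / (2n))."""
    return emp_risk + math.sqrt((kl + math.log(2 * math.sqrt(n) / delta)) / (2 * n))

# Hypothetical values: empirical risk 0.05, KL term 10 nats, n = 10000, delta = 0.05.
cert = mcallester_bound(0.05, 10.0, 10_000, 0.05)
print(round(cert, 4))  # -> 0.0802
```

The certificate tightens as `n` grows and as the posterior stays close to the prior (small KL), which is why such bounds double as training objectives.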

Enhancing adversarial training with second-order statistics of weights

G Jin, X Yi, W Huang, S Schewe… - Proceedings of the …, 2022 - openaccess.thecvf.com
Adversarial training has been shown to be one of the most effective approaches to improve
the robustness of deep neural networks. It is formalized as a min-max optimization over …

A survey on domain adaptation theory: learning bounds and theoretical guarantees

I Redko, E Morvant, A Habrard, M Sebban… - arXiv preprint arXiv …, 2020 - arxiv.org
All the well-known machine learning algorithms, whether supervised or semi-supervised,
work well only under a common assumption: the training and test data follow the …

Uncertainty estimation by fisher information-based evidential deep learning

D Deng, G Chen, Y Yu, F Liu… - … Conference on Machine …, 2023 - proceedings.mlr.press
Uncertainty estimation is a key factor that makes deep learning reliable in practical
applications. Recently proposed evidential neural networks explicitly account for different …
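In the evidential framework this snippet refers to, a network outputs non-negative evidence per class and uncertainty is read off the induced Dirichlet distribution. A minimal sketch, following the standard subjective-logic parameterization rather than the Fisher-information variant this particular paper proposes:

```python
import numpy as np

def dirichlet_uncertainty(evidence):
    # Evidential nets output non-negative evidence e_k per class;
    # the Dirichlet parameters are alpha_k = e_k + 1.
    alpha = np.asarray(evidence, dtype=float) + 1.0
    K = alpha.size
    probs = alpha / alpha.sum()   # expected class probabilities
    vacuity = K / alpha.sum()     # total uncertainty: 1 with no evidence, shrinks with more
    return probs, vacuity
```

With zero evidence for every class, `vacuity` is exactly 1 (a uniform Dirichlet); piling evidence on any class drives it toward 0.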

Curriculum learning of multiple tasks

A Pentina, V Sharmanska… - Proceedings of the …, 2015 - openaccess.thecvf.com
Sharing information between multiple tasks enables algorithms to achieve good
generalization performance even from small amounts of training data. However, in a realistic …

PAC-Bayesian theory meets Bayesian inference

P Germain, F Bach, A Lacoste… - Advances in Neural …, 2016 - proceedings.neurips.cc
We exhibit a strong link between frequentist PAC-Bayesian bounds and the Bayesian
marginal likelihood. That is, for the negative log-likelihood loss function, we show that the …
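The link rests on a standard variational identity (Donsker–Varadhan / Gibbs); a sketch, writing $\hat L$ for the empirical loss, $\pi$ for the prior, and $\rho$ for the posterior:

```latex
\min_{\rho}\left\{ \mathbb{E}_{\rho}\big[\hat L(h)\big]
  + \frac{1}{\lambda}\,\mathrm{KL}(\rho \,\|\, \pi) \right\}
= -\frac{1}{\lambda}\log \mathbb{E}_{\pi}\big[e^{-\lambda \hat L(h)}\big],
\quad \text{attained by } \rho^{*}(\mathrm{d}h) \propto e^{-\lambda \hat L(h)}\,\pi(\mathrm{d}h).
```

Taking the negative log-likelihood loss $\hat L(h) = -\frac{1}{n}\sum_{i=1}^{n}\log p(x_i \mid h)$ and $\lambda = n$, the Gibbs minimizer $\rho^{*}$ is exactly the Bayesian posterior, and the right-hand side becomes $-\frac{1}{n}\log p(x_1,\dots,x_n)$, the scaled negative log marginal likelihood, which is the correspondence the abstract states.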