Generalization bounds: Perspectives from information theory and PAC-Bayes

F Hellström, G Durisi, B Guedj… - … and Trends® in …, 2025 - nowpublishers.com
A fundamental question in theoretical machine learning is generalization. Over the past
decades, the PAC-Bayesian approach has been established as a flexible framework to …
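
For orientation, a classic result of the kind this monograph surveys is the McAllester/Maurer PAC-Bayesian bound below; this is the standard statement from the literature, not necessarily the formulation the (truncated) monograph itself uses. With probability at least $1-\delta$ over an i.i.d. sample $S$ of size $n$, simultaneously for all posteriors $\rho$,
\[
\mathbb{E}_{h\sim\rho}\bigl[L(h)\bigr] \;\le\; \mathbb{E}_{h\sim\rho}\bigl[\hat{L}_S(h)\bigr] \;+\; \sqrt{\frac{\mathrm{KL}(\rho\,\|\,\pi) + \ln(2\sqrt{n}/\delta)}{2n}},
\]
where $\pi$ is a data-independent prior, $L$ the population risk, $\hat{L}_S$ the empirical risk, and losses take values in $[0,1]$.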

Bayes meets Bernstein at the meta level: an analysis of fast rates in meta-learning with PAC-Bayes

C Riou, P Alquier, BE Chérief-Abdellatif - arXiv preprint arXiv:2302.11709, 2023 - arxiv.org
Bernstein's condition is a key assumption that guarantees fast rates in machine learning. For
example, the Gibbs algorithm with prior $\pi$ has an excess risk in $O(d_{\pi}/n)$, as …
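
For context, here are the standard definitions behind this snippet, hedged since the truncated abstract does not fix the paper's exact conventions: the Gibbs posterior, and Bernstein's condition in one common form,
\[
\hat{\rho}_{\lambda}(\mathrm{d}h) \;\propto\; \exp\!\bigl(-\lambda\,\hat{L}_S(h)\bigr)\,\pi(\mathrm{d}h),
\qquad
\mathbb{E}\bigl[(\ell(h,Z)-\ell(h^{*},Z))^{2}\bigr] \;\le\; B\,\bigl(L(h)-L(h^{*})\bigr)\ \ \text{for all } h,
\]
where $\lambda>0$ is an inverse temperature (conventions differ on whether a factor of $n$ is absorbed into $\lambda$) and $h^{*}$ minimizes the population risk $L$. Under a condition of this type, the slow rate $O(\sqrt{d_{\pi}/n})$ typically improves to the fast rate $O(d_{\pi}/n)$ quoted above; the precise complexity term $d_{\pi}$ is defined in the paper.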

Exactly tight information-theoretic generalization error bound for the quadratic Gaussian problem

R Zhou, C Tian, T Liu - IEEE Journal on Selected Areas in …, 2024 - ieeexplore.ieee.org
We provide a new information-theoretic generalization error bound that is exactly tight (i.e.,
matching even the constant) for the canonical quadratic Gaussian (location) problem. Most …
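
As background, the generic mutual-information bound of Xu and Raginsky (2017), which results of this kind refine, reads
\[
\Bigl|\,\mathbb{E}\bigl[L(W)-\hat{L}_S(W)\bigr]\Bigr| \;\le\; \sqrt{\frac{2\sigma^{2}\, I(W;S)}{n}},
\]
where $W$ is the hypothesis returned by the algorithm on the sample $S$ of size $n$, $I(W;S)$ is their mutual information, and the loss is assumed $\sigma$-sub-Gaussian. Generic bounds of this type are not tight for the quadratic Gaussian location problem, which is what the "exactly tight" claim above addresses.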

Learning an explicit hyper-parameter prediction function conditioned on tasks

J Shu, D Meng, Z Xu - Journal of Machine Learning Research, 2023 - jmlr.org
Meta-learning has recently attracted much attention in the machine learning community.
Contrary to conventional machine learning, which aims to learn inherent prediction rules to …

More flexible PAC-Bayesian meta-learning by learning learning algorithms

H Zakerinia, A Behjati, CH Lampert - arXiv preprint arXiv:2402.04054, 2024 - arxiv.org
We introduce a new framework for studying meta-learning methods using PAC-Bayesian
theory. Its main advantage over previous work is that it allows for more flexibility in how the …
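
The snippet is truncated, so as a hedged illustration of what a two-level PAC-Bayesian meta-learning guarantee looks like, here is the schematic shape of earlier bounds in the style of Pentina and Lampert (2014) and Amit and Meir (2018), against which such frameworks are positioned; exact terms and constants vary across papers:
\[
\mathrm{er}(\mathcal{Q})
\;\lesssim\;
\widehat{\mathrm{er}}(\mathcal{Q})
\;+\;
\sqrt{\frac{\mathrm{KL}(\mathcal{Q}\,\|\,\mathcal{P})}{n}}
\;+\;
\frac{1}{n}\sum_{i=1}^{n}\sqrt{\frac{\mathrm{KL}(\mathcal{Q}\,\|\,\mathcal{P}) + \mathbb{E}_{P\sim\mathcal{Q}}\,\mathrm{KL}(Q_{i}\,\|\,P)}{m_{i}}},
\]
where $\mathcal{P}$ is a hyper-prior over priors, $\mathcal{Q}$ a hyper-posterior learned from $n$ observed tasks with $m_{i}$ samples each, and $Q_{i}$ the per-task posteriors; the first KL term controls the environment level and the second the within-task level.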