Generalization bounds: Perspectives from information theory and PAC-Bayes

F Hellström, G Durisi, B Guedj… - … and Trends® in …, 2025 - nowpublishers.com
A fundamental question in theoretical machine learning is generalization. Over the past
decades, the PAC-Bayesian approach has been established as a flexible framework to …
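
For orientation, one canonical PAC-Bayesian bound of the McAllester–Maurer type (a standard form quoted here for context; exact constants differ across variants in the literature): for a fixed prior $\pi$ and a loss bounded in $[0,1]$, with probability at least $1-\delta$ over an i.i.d. sample of size $n$, simultaneously for all posteriors $\rho$,

$$\mathbb{E}_{h \sim \rho}[L(h)] \;\le\; \mathbb{E}_{h \sim \rho}[\hat{L}(h)] + \sqrt{\frac{\mathrm{KL}(\rho \,\|\, \pi) + \ln(2\sqrt{n}/\delta)}{2n}},$$

where $L$ and $\hat{L}$ denote the population and empirical risks.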

Tighter information-theoretic generalization bounds from supersamples

Z Wang, Y Mao - arXiv preprint arXiv:2302.02432, 2023 - arxiv.org
In this work, we present a variety of novel information-theoretic generalization bounds for
learning algorithms, from the supersample setting of Steinke & Zakynthinou (2020), the …
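
As a reminder of the framework the title refers to (a sketch of Steinke & Zakynthinou's conditional mutual information (CMI) setting, not of this paper's tighter bounds): draw a supersample $\tilde{Z} \in \mathcal{Z}^{n \times 2}$ of $2n$ i.i.d. examples together with independent selectors $U \sim \mathrm{Unif}(\{0,1\}^n)$, where $U_i$ picks which entry of row $i$ enters the training set and the remaining entries serve as ghost test points. For a loss bounded in $[0,1]$, the baseline CMI bound reads

$$\big|\,\mathbb{E}[\mathrm{gen}(W, S)]\,\big| \;\le\; \sqrt{\frac{2\, I(W; U \mid \tilde{Z})}{n}},$$

and the conditioning on $\tilde{Z}$ is what keeps the information term finite even for deterministic learners.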

Information complexity of stochastic convex optimization: Applications to generalization and memorization

I Attias, GK Dziugaite, M Haghifam, R Livni… - arXiv preprint arXiv …, 2024 - arxiv.org
In this work, we investigate the interplay between memorization and learning in the context
of stochastic convex optimization (SCO). We define memorization via the information …

Bayes meets Bernstein at the meta level: an analysis of fast rates in meta-learning with PAC-Bayes

C Riou, P Alquier, BE Chérief-Abdellatif - arXiv preprint arXiv:2302.11709, 2023 - arxiv.org
Bernstein's condition is a key assumption that guarantees fast rates in machine learning. For
example, the Gibbs algorithm with prior $\pi$ has an excess risk in $O(d_{\pi}/n)$, as …
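
For reference, Bernstein's condition in one standard form (quoted for context, not taken from this paper): there exists $B > 0$ such that, for every hypothesis $f$, the excess loss relative to the risk minimizer $f^*$ satisfies

$$\mathbb{E}\big[(\ell_f(Z) - \ell_{f^*}(Z))^2\big] \;\le\; B\, \mathbb{E}\big[\ell_f(Z) - \ell_{f^*}(Z)\big].$$

This variance control is what upgrades the generic $O(1/\sqrt{n})$ slow rate to fast $O(1/n)$ rates of the kind quoted above.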

Exactly tight information-theoretic generalization error bound for the quadratic Gaussian problem

R Zhou, C Tian, T Liu - IEEE Journal on Selected Areas in …, 2024 - ieeexplore.ieee.org
We provide a new information-theoretic generalization error bound that is exactly tight (i.e.,
matching even the constant) for the canonical quadratic Gaussian (location) problem. Most …
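
To see why exact tightness is a meaningful benchmark here, recall the textbook computation for the scalar case (a standard calculation under the usual assumptions: $Z_1, \dots, Z_n \sim \mathcal{N}(\mu, \sigma^2)$ i.i.d., squared loss, and the sample-mean estimator $\hat{\mu} = \frac{1}{n}\sum_{i} Z_i$):

$$\mathbb{E}\Big[\frac{1}{n}\sum_{i=1}^{n} (\hat{\mu} - Z_i)^2\Big] = \sigma^2\Big(1 - \frac{1}{n}\Big), \qquad \mathbb{E}\big[(\hat{\mu} - Z)^2\big] = \sigma^2\Big(1 + \frac{1}{n}\Big),$$

so the expected generalization gap is exactly $2\sigma^2/n$, and an "exactly tight" bound must recover this constant.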

Learning an explicit hyper-parameter prediction function conditioned on tasks

J Shu, D Meng, Z Xu - The Journal of Machine Learning Research, 2023 - dl.acm.org
Meta learning has recently attracted much attention in the machine learning community.
In contrast to conventional machine learning, which aims to learn inherent prediction rules to …

On the generalization error of meta learning for the Gibbs algorithm

Y Bu, HV Tetali, G Aminian… - … on Information Theory …, 2023 - ieeexplore.ieee.org
We analyze the generalization ability of joint-training meta learning algorithms via the Gibbs
algorithm. Our exact characterization of the expected meta generalization error for the meta …
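
For background, the single-task result this line of work builds on (stated from memory, so the exact scaling should be checked against Aminian et al.'s exact characterization of the Gibbs generalization error): for the Gibbs posterior $P_{W|S}(w) \propto \pi(w)\, e^{-\gamma \hat{L}_S(w)}$ with inverse temperature $\gamma$,

$$\mathbb{E}[\mathrm{gen}(W, S)] \;=\; \frac{I_{\mathrm{SKL}}(W; S)}{\gamma},$$

where $I_{\mathrm{SKL}}$ is the symmetrized KL information between the learned weights and the training sample.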

More Flexible PAC-Bayesian Meta-Learning by Learning Learning Algorithms

H Zakerinia, A Behjati, CH Lampert - arXiv preprint arXiv:2402.04054, 2024 - arxiv.org
We introduce a new framework for studying meta-learning methods using PAC-Bayesian
theory. Its main advantage over previous work is that it allows for more flexibility in how the …

Error Bounds of Supervised Classification from Information-Theoretic Perspective

B Qi, W Gong, L Li - arXiv preprint arXiv:2406.04567, 2024 - arxiv.org
A number of research questions on deep learning (DL) remain unanswered, including the
remarkable generalization power of overparametrized neural networks, the efficient …

Towards Sharper Information-theoretic Generalization Bounds for Meta-Learning

W Wen, T Gong, Y Dong, YJ Liu, W Zhang - arXiv preprint arXiv …, 2025 - arxiv.org
In recent years, information-theoretic generalization bounds have emerged as a promising
approach for analyzing the generalization capabilities of meta-learning algorithms …