Generalization bounds: Perspectives from information theory and PAC-Bayes
A fundamental question in theoretical machine learning is generalization. Over the past
decades, the PAC-Bayesian approach has been established as a flexible framework to …
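For context (standard background in this area, stated in our notation rather than taken from the monograph above): a canonical information-theoretic generalization bound is that of Xu and Raginsky (2017). If the loss \ell(w, Z) is \sigma-subgaussian under the data distribution for every hypothesis w, and a learner maps an i.i.d. sample S of n points to a hypothesis W, then

\big| \mathbb{E}\big[ L_\mu(W) - \hat{L}_S(W) \big] \big| \;\le\; \sqrt{\frac{2\sigma^2 \, I(W; S)}{n}},

where L_\mu is the population risk, \hat{L}_S the empirical risk, and I(W;S) the mutual information between the output hypothesis and the training sample.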
Sample-conditioned hypothesis stability sharpens information-theoretic generalization bounds
We present new information-theoretic generalization guarantees through a novel
construction of the "neighboring-hypothesis" matrix and a new family of stability notions …
Minimum description length and generalization guarantees for representation learning
M Sefidgaran, A Zaidi… - Advances in Neural …, 2024 - proceedings.neurips.cc
A major challenge in designing efficient statistical supervised learning algorithms is finding
representations that perform well not only on available training samples but also on unseen …
Information-theoretic generalization bounds for learning from quantum data
Learning tasks play an increasingly prominent role in quantum information and computation.
They range from fundamental problems such as state discrimination and metrology over the …
Tighter information-theoretic generalization bounds from supersamples
In this work, we present a variety of novel information-theoretic generalization bounds for
learning algorithms, from the supersample setting of Steinke & Zakynthinou (2020) - the …
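For context (standard material from Steinke & Zakynthinou (2020), stated in our notation rather than drawn from the snippet above): the supersample setting draws a table \tilde{Z} \in \mathcal{Z}^{n \times 2} of 2n i.i.d. data points together with independent uniform selector bits U \in \{0,1\}^n that pick, for each row, which of the two entries joins the training set S. For losses bounded in [0,1], the resulting conditional-mutual-information bound reads

\big| \mathbb{E}\big[ L_\mu(W) - \hat{L}_S(W) \big] \big| \;\le\; \sqrt{\frac{2 \, I(W; U \mid \tilde{Z})}{n}},

and conditioning on the supersample \tilde{Z} keeps the information term finite (at most n \ln 2) even for deterministic learning algorithms.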
An information-theoretic approach to generalization theory
B Rodríguez-Gálvez, R Thobaben… - arXiv preprint arXiv …, 2024 - arxiv.org
We investigate the in-distribution generalization of machine learning algorithms. We depart
from traditional complexity-based approaches by analyzing information-theoretic bounds …
More PAC-Bayes bounds: From bounded losses, to losses with general tail behaviors, to anytime-validity
B Rodríguez-Gálvez, R Thobaben… - arXiv preprint arXiv …, 2023 - arxiv.org
In this paper, we present new high-probability PAC-Bayes bounds for different types of
losses. Firstly, for losses with a bounded range, we present a strengthened version of …
More PAC-Bayes bounds: From bounded losses, to losses with general tail behaviors, to anytime validity
B Rodríguez-Gálvez, R Thobaben… - Journal of Machine …, 2024 - jmlr.org
In this paper, we present new high-probability PAC-Bayes bounds for different types of
losses. Firstly, for losses with a bounded range, we recover a strengthened version of …
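As a reference point for the bounded-loss case treated in the two entries above (a classical baseline in this line of work, not the papers' new results): the PAC-Bayes-kl bound of Seeger and Maurer states that for losses in [0,1], with probability at least 1-\delta over an i.i.d. sample S of size n, simultaneously for every posterior Q,

\mathrm{kl}\Big( \mathbb{E}_{h \sim Q}\big[\hat{L}_S(h)\big] \,\Big\|\, \mathbb{E}_{h \sim Q}\big[L_\mu(h)\big] \Big) \;\le\; \frac{\mathrm{KL}(Q \,\|\, P) + \ln\frac{2\sqrt{n}}{\delta}}{n},

where P is a prior chosen before seeing the data and \mathrm{kl}(p \| q) denotes the binary relative entropy.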
Information-Theoretic Generalization Bounds for Transductive Learning and its Applications
H Tang, Y Liu - arXiv preprint arXiv:2311.04561, 2023 - arxiv.org
In this paper, we develop data-dependent and algorithm-dependent generalization bounds
for transductive learning algorithms in the context of information theory for the first time. We …
Comparing Comparators in Generalization Bounds
F Hellström, B Guedj - International Conference on Artificial …, 2024 - proceedings.mlr.press
We derive generic information-theoretic and PAC-Bayesian generalization bounds involving
an arbitrary convex comparator function, which measures the discrepancy between the …
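For orientation (a standard template in this literature, stated in our notation; the paper's own statements may differ in form): PAC-Bayes bounds with a jointly convex comparator \Delta typically read, with probability at least 1-\delta over the sample S of size n, for all posteriors Q,

\Delta\Big( \mathbb{E}_{h \sim Q}\big[\hat{L}_S(h)\big], \, \mathbb{E}_{h \sim Q}\big[L_\mu(h)\big] \Big) \;\le\; \frac{\mathrm{KL}(Q \,\|\, P) + \ln\frac{\xi_n(\Delta)}{\delta}}{n}, \qquad \xi_n(\Delta) = \mathbb{E}_{h \sim P}\, \mathbb{E}_{S}\Big[ e^{\, n \, \Delta(\hat{L}_S(h),\, L_\mu(h))} \Big],

so the choice of comparator trades the strength of the left-hand-side discrepancy measure against the size of the moment term \xi_n(\Delta); taking \Delta = \mathrm{kl} recovers the PAC-Bayes-kl bound above.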