Generalization bounds: Perspectives from information theory and PAC-Bayes
A fundamental question in theoretical machine learning is generalization. Over the past
decades, the PAC-Bayesian approach has been established as a flexible framework to …
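For orientation, one standard PAC-Bayesian guarantee in this line of work (a McAllester/Maurer-style bound for losses in $[0,1]$, recalled here for reference rather than taken from the listed survey) reads: with probability at least $1-\delta$ over an i.i.d. sample $S$ of size $n$, simultaneously for all posteriors $\rho$ over hypotheses and any fixed prior $\pi$,

$$
\mathbb{E}_{h\sim\rho}\!\left[L(h)\right] \;\le\; \mathbb{E}_{h\sim\rho}\!\left[\hat{L}_S(h)\right] \;+\; \sqrt{\frac{\mathrm{KL}(\rho\,\|\,\pi) + \ln\frac{2\sqrt{n}}{\delta}}{2n}} .
$$

The $\mathrm{KL}(\rho\,\|\,\pi)$ term is what makes the framework flexible: any prior/posterior pair yields a valid bound.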
Sample compression unleashed: New generalization bounds for real-valued losses
The sample compression theory provides generalization guarantees for predictors that can
be fully defined using a subset of the training dataset and a (short) message string, generally …
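A prototypical sample compression bound (a classical agnostic form for the zero-one loss, stated here only to situate the contribution; the listed paper's point is to extend such guarantees to real-valued losses): with probability at least $1-\delta$, every predictor reconstructed from a compression set of fixed size $d$ out of $m$ training points satisfies

$$
R(h) \;\le\; \hat{R}(h) \;+\; \sqrt{\frac{\ln\binom{m}{d} + \ln\frac{1}{\delta}}{2(m-d)}},
$$

where $\hat{R}(h)$ is the empirical error on the $m-d$ points outside the compression set; these points remain independent of $h$, so Hoeffding's inequality applies, and the $\ln\binom{m}{d}$ term comes from a union bound over possible compression sets. Allowing a (short) message string adds a further complexity term of the order of its description length.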
Lifted Coefficient of Determination: Fast model-free prediction intervals and likelihood-free model comparison
We propose the $\textit{lifted linear model}$, and derive model-free prediction intervals that
become tighter as the correlation between predictions and observations increases. These …
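For reference (the snippet does not give the lifted construction itself), the classical coefficient of determination that the title alludes to is

$$
R^2 \;=\; 1 - \frac{\sum_{i=1}^{n} (y_i - \hat{y}_i)^2}{\sum_{i=1}^{n} (y_i - \bar{y})^2}, \qquad \bar{y} = \frac{1}{n}\sum_{i=1}^{n} y_i ,
$$

and for an ordinary least-squares fit it equals the squared sample correlation between the predictions $\hat{y}_i$ and the observations $y_i$. This is consistent with the abstract's claim that the derived prediction intervals tighten as that correlation grows, though how the lifted variant formalizes the link is not shown in this excerpt.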