User-friendly introduction to PAC-Bayes bounds
P Alquier - Foundations and Trends® in Machine Learning, 2024 - nowpublishers.com
Aggregated predictors are obtained by making a set of basic predictors vote according to
some weights, that is, to some probability distribution. Randomized predictors are obtained …
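A minimal sketch of the two constructions contrasted above, assuming a hypothetical list of predictor objects exposing a scikit-learn-style predict method; the weights stand in for the probability distribution of the PAC-Bayes setting:

    import numpy as np

    def aggregated_predict(predictors, weights, X):
        """Aggregated predictor: weighted vote (average) of the basic predictors."""
        w = np.asarray(weights, dtype=float)
        w = w / w.sum()                                        # normalise to a probability distribution
        preds = np.stack([p.predict(X) for p in predictors])   # shape (K, n_samples)
        return w @ preds                                       # weighted average of the K votes

    def randomized_predict(predictors, weights, X, rng=None):
        """Randomized predictor: draw one basic predictor according to the weights, then use it."""
        rng = rng or np.random.default_rng()
        w = np.asarray(weights, dtype=float)
        w = w / w.sum()
        k = rng.choice(len(predictors), p=w)
        return predictors[k].predict(X)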
Generalization bounds: Perspectives from information theory and PAC-Bayes
A fundamental question in theoretical machine learning is generalization. Over the past
decades, the PAC-Bayesian approach has been established as a flexible framework to …
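For context, a representative bound of the kind this line of work builds on is the McAllester/Maurer PAC-Bayes bound for losses in [0, 1] (standard notation, not a result specific to this paper): with probability at least 1 - \delta over an i.i.d. sample of size n, simultaneously for all posteriors \rho,

\[
\mathbb{E}_{h\sim\rho}\, R(h) \;\le\; \mathbb{E}_{h\sim\rho}\, \hat R_n(h)
\;+\; \sqrt{\frac{\mathrm{KL}(\rho \,\|\, \pi) + \ln\!\big(2\sqrt{n}/\delta\big)}{2n}},
\]

where \pi is a prior fixed before seeing the data, R is the population risk and \hat R_n the empirical risk.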
MMD-FUSE: Learning and combining kernels for two-sample testing without data splitting
We propose novel statistics which maximise the power of a two-sample test based on the
Maximum Mean Discrepancy (MMD), by adapting over the set of kernels used in defining it …
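As background for the adaptive statistic described above, a plain unbiased estimator of MMD^2 with a single fixed Gaussian kernel looks as follows (a sketch of the classical statistic, not of the kernel-combining MMD-FUSE test itself; the bandwidth parameter is an illustrative assumption):

    import numpy as np

    def gaussian_kernel(A, B, bandwidth=1.0):
        """k(a, b) = exp(-||a - b||^2 / (2 * bandwidth^2))."""
        sq = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2.0 * A @ B.T
        return np.exp(-sq / (2.0 * bandwidth**2))

    def mmd2_unbiased(X, Y, bandwidth=1.0):
        """Unbiased estimate of MMD^2 between samples X (n, d) and Y (m, d)."""
        n, m = len(X), len(Y)
        Kxx = gaussian_kernel(X, X, bandwidth)
        Kyy = gaussian_kernel(Y, Y, bandwidth)
        Kxy = gaussian_kernel(X, Y, bandwidth)
        term_x = (Kxx.sum() - np.trace(Kxx)) / (n * (n - 1))   # drop diagonal for unbiased within-sample mean
        term_y = (Kyy.sum() - np.trace(Kyy)) / (m * (m - 1))
        return term_x + term_y - 2.0 * Kxy.mean()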
Statistical guarantees for variational autoencoders using pac-bayesian theory
Abstract Since their inception, Variational Autoencoders (VAEs) have become central in
machine learning. Despite their widespread use, numerous questions regarding their …
PAC-Bayes generalisation bounds for heavy-tailed losses through supermartingales
While PAC-Bayes is now an established learning framework for light-tailed losses (e.g., subgaussian or subexponential), its extension to the case of heavy-tailed losses …
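To make the light-tailed/heavy-tailed distinction concrete, the standard (paper-independent) definition is that a random loss X is \sigma^2-subgaussian when

\[
\mathbb{E}\,\exp\big(\lambda (X - \mathbb{E} X)\big) \;\le\; \exp\big(\lambda^2 \sigma^2 / 2\big)
\quad \text{for all } \lambda \in \mathbb{R},
\]

whereas a heavy-tailed loss may have an infinite moment generating function (or even infinite variance), so the Chernoff-type arguments behind the light-tailed bounds do not apply directly.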
Learning via Wasserstein-based high probability generalisation bounds
Minimising upper bounds on the population risk or the generalisation gap has been widely
used in structural risk minimisation (SRM); this is in particular at the core of PAC-Bayesian …
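Schematically, the SRM recipe invoked here is to return the predictor (or posterior) minimising the right-hand side of a high-probability upper bound, i.e. something of the form

\[
\hat h \;\in\; \operatorname*{arg\,min}_{h \in \mathcal{H}} \Big\{ \hat R_n(h) + \mathrm{pen}_n(h, \delta) \Big\},
\]

where \hat R_n is the empirical risk and \mathrm{pen}_n is the complexity term of the bound (a KL divergence to a prior in classical PAC-Bayes; a Wasserstein-distance term in bounds of the kind this paper's title refers to). This is the standard formulation, not the paper's specific objective.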
PAC-Bayesian generalization bounds for adversarial generative models
We extend PAC-Bayesian theory to generative models and develop generalization bounds
for models based on the Wasserstein distance and the total variation distance. Our first result …
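The two distances named here are standard: for probability measures \mu and \nu,

\[
W_1(\mu, \nu) \;=\; \sup_{\|f\|_{\mathrm{Lip}} \le 1} \big| \mathbb{E}_{\mu} f - \mathbb{E}_{\nu} f \big|,
\qquad
\mathrm{TV}(\mu, \nu) \;=\; \sup_{A} \big| \mu(A) - \nu(A) \big|,
\]

so a generalisation bound in either metric controls how far the learned generative distribution can lie from the data distribution in that sense.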
Improved algorithms for stochastic linear bandits using tail bounds for martingale mixtures
We present improved algorithms with worst-case regret guarantees for the stochastic linear
bandit problem. The widely used "optimism in the face of uncertainty" principle reduces a …
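A generic sketch of the "optimism in the face of uncertainty" step for linear bandits (ridge estimate plus an exploration bonus from a confidence ellipsoid); the confidence radius beta is taken as given, and constructing a tighter radius, e.g. via tail bounds for martingale mixtures, is exactly the kind of improvement such work targets. Names and signatures below are illustrative assumptions:

    import numpy as np

    def optimistic_action(actions, A, b, beta):
        """Pick the arm with the largest optimistic reward estimate.

        actions: (K, d) candidate feature vectors
        A:       (d, d) regularised Gram matrix  lam*I + sum_t x_t x_t^T
        b:       (d,)   sum_t r_t x_t
        beta:    confidence-ellipsoid radius (method-specific)
        """
        A_inv = np.linalg.inv(A)
        theta_hat = A_inv @ b                                # regularised least-squares estimate
        bonus = np.sqrt(np.einsum("ki,ij,kj->k", actions, A_inv, actions))
        return int(np.argmax(actions @ theta_hat + beta * bonus))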
Tighter generalisation bounds via interpolation
This paper contains a recipe for deriving new PAC-Bayes generalisation bounds based on
the $(f,\Gamma)$-divergence, and, in addition, presents PAC-Bayes generalisation bounds …
A PAC-Bayesian link between generalisation and flat minima
Modern machine learning usually involves predictors in the overparametrised setting
(number of trained parameters greater than dataset size), and their training yields not only …