[BOOK][B] Oracle inequalities in empirical risk minimization and sparse recovery problems: École d'Été de Probabilités de Saint-Flour XXXVIII-2008
V Koltchinskii - 2011 - books.google.com
The purpose of these lecture notes is to provide an introduction to the general theory of
empirical risk minimization with an emphasis on excess risk bounds and oracle inequalities …
Learning with square loss: Localization through offset Rademacher complexity
We consider regression with square loss and general classes of functions without the
boundedness assumption. We introduce a notion of offset Rademacher complexity that …
Exponential savings in agnostic active learning through abstention
We show that in pool-based active classification without assumptions on the underlying
distribution, if the learner is given the power to abstain from some predictions by paying the …
Empirical entropy, minimax regret and minimax risk
A Rakhlin, K Sridharan, AB Tsybakov - 2017 - projecteuclid.org
We consider the random design regression model with square loss. We propose a method
that aggregates empirical risk minimizers (ERM) over appropriately chosen random subsets and …
Optimal learning with Bernstein online aggregation
O Wintenberger - Machine Learning, 2017 - Springer
We introduce a new recursive aggregation procedure called Bernstein Online Aggregation
(BOA). Its exponential weights include a second order refinement. The procedure is optimal …
Kullback–Leibler aggregation and misspecified generalized linear models
P Rigollet - 2012 - projecteuclid.org
Minimax lower bounds. Under some convexity and tail conditions, we prove minimax lower
bounds for the three problems of Kullback–Leibler aggregation: model selection, linear and …
Distribution-free robust linear regression
We study random design linear regression with no assumptions on the distribution of the
covariates and with a heavy-tailed response variable. In this distribution-free regression …
Deviation optimal learning using greedy Q-aggregation
D Dai, P Rigollet, T Zhang - 2012 - projecteuclid.org
Given a finite family of functions, the goal of model selection aggregation is to construct a
procedure that mimics the function from this family that is the closest to an unknown …
Fast rates with high probability in exp-concave statistical learning
N Mehta - Artificial Intelligence and Statistics, 2017 - proceedings.mlr.press
We present an algorithm for the statistical learning setting with a bounded exp-concave loss
in $d$ dimensions that obtains excess risk $O(d\log(1/\delta)/n)$ with probability $1-\delta$. The core …
Local Risk Bounds for Statistical Aggregation
In the problem of aggregation, the aim is to combine a given class of base predictors to
achieve predictions nearly as accurate as the best one. In this flexible framework, no …