[BOOK] Combining pattern classifiers: methods and algorithms
LI Kuncheva - 2014 - books.google.com
A unified, coherent treatment of current classifier ensemble methods, from fundamentals of
pattern recognition to ensemble feature selection, now in its second edition The art and …
[PDF] Rademacher and Gaussian complexities: Risk bounds and structural results
We investigate the use of certain data-dependent estimates of the complexity of a function
class, called Rademacher and Gaussian complexities. In a decision theoretic setting, we …
Rotation forest: A new classifier ensemble method
We propose a method for generating classifier ensembles based on feature extraction. To
create the training data for a base classifier, the feature set is randomly split into K subsets …
Boosting algorithms as gradient descent
We provide an abstract characterization of boosting algorithms as gradient descent on cost-
functionals in an inner-product function space. We prove convergence of these functional …
Soft margins for AdaBoost
Recently ensemble methods like ADABOOST have been applied successfully in many
problems, while seemingly defying the problems of overfitting. ADABOOST rarely overfits in …
Advance and prospects of AdaBoost algorithm
C Ying, M Qi-Guang, L Jia-Chen, G Lin - Acta Automatica Sinica, 2013 - Elsevier
AdaBoost is one of the most successful boosting algorithms. It has a solid theoretical basis
and has achieved great success in practical applications. AdaBoost can boost a weak learning …
Empirical margin distributions and bounding the generalization error of combined classifiers
V Koltchinskii, D Panchenko - The Annals of Statistics, 2002 - projecteuclid.org
We prove new probabilistic upper bounds on generalization error of complex classifiers that
are combinations of simple classifiers. Such combinations could be implemented by neural …
[PDF] Large scale transductive SVMs
We show how the concave-convex procedure can be applied to transductive SVMs, which
traditionally require solving a combinatorial search problem. This provides for the first time a …
An analysis of diversity measures
Diversity among the base classifiers is deemed to be important when constructing a
classifier ensemble. Numerous algorithms have been proposed to construct a good …
An introduction to boosting and leveraging
We provide an introduction to theoretical and practical aspects of Boosting and Ensemble
learning, providing a useful reference for researchers in the field of Boosting as well as for …