[BOOK] Combining pattern classifiers: methods and algorithms

LI Kuncheva - 2014 - books.google.com
A unified, coherent treatment of current classifier ensemble methods, from fundamentals of
pattern recognition to ensemble feature selection, now in its second edition. The art and …

[PDF] Rademacher and Gaussian complexities: Risk bounds and structural results

PL Bartlett, S Mendelson - Journal of Machine Learning Research, 2002 - jmlr.org
We investigate the use of certain data-dependent estimates of the complexity of a function
class, called Rademacher and Gaussian complexities. In a decision theoretic setting, we …
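For orientation, the quantity the paper studies is usually defined as follows (generic notation, not quoted from the snippet): the empirical Rademacher complexity of a function class \mathcal{F} on a sample S = (x_1, \dots, x_n) is

    \hat{\mathfrak{R}}_S(\mathcal{F}) = \mathbb{E}_{\sigma}\Big[ \sup_{f \in \mathcal{F}} \frac{1}{n} \sum_{i=1}^{n} \sigma_i f(x_i) \Big], \qquad \sigma_1, \dots, \sigma_n \ \text{i.i.d. uniform on } \{-1, +1\},

and \mathfrak{R}_n(\mathcal{F}) = \mathbb{E}_S[\hat{\mathfrak{R}}_S(\mathcal{F})] is its expectation over the sample. Bartlett and Mendelson's own convention differs slightly (an absolute value inside the supremum and a factor 2/n), but the role is the same: a data-dependent measure of how well the class can fit random signs.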

Rotation forest: A new classifier ensemble method

JJ Rodriguez, LI Kuncheva… - IEEE transactions on …, 2006 - ieeexplore.ieee.org
We propose a method for generating classifier ensembles based on feature extraction. To
create the training data for a base classifier, the feature set is randomly split into K subsets …
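A minimal sketch of that construction, assuming NumPy and scikit-learn; the per-subset class and bootstrap sampling of the published algorithm is simplified here, and the function name is illustrative:

    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.tree import DecisionTreeClassifier

    def rotation_tree(X, y, K=3, seed=0):
        # One simplified Rotation Forest step: rotate the features with block-wise PCA,
        # then train an ordinary decision tree on the rotated data.
        rng = np.random.default_rng(seed)
        n, d = X.shape
        subsets = np.array_split(rng.permutation(d), K)   # random split of the feature set into K subsets
        R = np.zeros((d, d))                              # rotation matrix, assembled block by block
        for idx in subsets:
            rows = rng.choice(n, size=max(len(idx) + 1, int(0.75 * n)), replace=False)
            pca = PCA().fit(X[np.ix_(rows, idx)])         # PCA on a subsample, restricted to this subset
            R[np.ix_(idx, idx)] = pca.components_.T       # keep all components: a rotation, not a reduction
        tree = DecisionTreeClassifier(random_state=seed).fit(X @ R, y)
        return tree, R

An ensemble then repeats this with fresh random splits and seeds, and votes the predictions tree.predict(X_new @ R) across members.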

Boosting algorithms as gradient descent

L Mason, J Baxter, P Bartlett… - Advances in neural …, 1999 - proceedings.neurips.cc
We provide an abstract characterization of boosting algorithms as gradient descent on cost
functionals in an inner-product function space. We prove convergence of these functional …
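A rough sketch of that view for the exponential cost, assuming NumPy and scikit-learn stumps (names are illustrative; the paper's AnyBoost formulation is more general): each round fits a weak learner to the direction of steepest descent of the cost functional, then takes a line-search step.

    import numpy as np
    from sklearn.tree import DecisionTreeClassifier

    def boost_exponential(X, y, rounds=50):
        # y must be in {-1, +1}. F holds the current ensemble output F(x_i).
        F = np.zeros(len(y), dtype=float)
        ensemble = []
        for _ in range(rounds):
            w = np.exp(-y * F)                 # minus the functional gradient of sum_i exp(-y_i F(x_i)) is y_i * w_i
            w = w / w.sum()
            h = DecisionTreeClassifier(max_depth=1).fit(X, y, sample_weight=w)
            pred = h.predict(X)
            err = float(np.clip(w[pred != y].sum(), 1e-12, 1 - 1e-12))
            alpha = 0.5 * np.log((1 - err) / err)   # exact line search for the exponential cost (the AdaBoost step)
            F = F + alpha * pred
            ensemble.append((alpha, h))
        return ensemble

Predictions of the combined classifier are sign(sum of alpha * h.predict(X_new) over the ensemble); with the exponential cost this reduces to AdaBoost.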

Soft margins for AdaBoost

G Rätsch, T Onoda, KR Müller - Machine learning, 2001 - Springer
Recently ensemble methods like ADABOOST have been applied successfully in many
problems, while seemingly defying the problems of overfitting. ADABOOST rarely overfits in …

Advance and prospects of AdaBoost algorithm

C Ying, M Qi-Guang, L Jia-Chen, G Lin - Acta Automatica Sinica, 2013 - Elsevier
AdaBoost is one of the most successful boosting algorithms. It has a solid theoretical basis
and has achieved great success in practical applications. AdaBoost can boost a weak learning …
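In practice, running AdaBoost takes a few lines with scikit-learn; a generic usage sketch (not code from the survey), with depth-1 decision trees as the default weak learner:

    from sklearn.datasets import make_classification
    from sklearn.ensemble import AdaBoostClassifier
    from sklearn.model_selection import train_test_split

    X, y = make_classification(n_samples=1000, n_features=20, random_state=0)
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

    # The default base learner is a depth-1 decision tree (a decision stump).
    model = AdaBoostClassifier(n_estimators=200, random_state=0).fit(X_train, y_train)
    print("test accuracy:", model.score(X_test, y_test))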

Empirical margin distributions and bounding the generalization error of combined classifiers

V Koltchinskii, D Panchenko - The Annals of Statistics, 2002 - projecteuclid.org
We prove new probabilistic upper bounds on generalization error of complex classifiers that
are combinations of simple classifiers. Such combinations could be implemented by neural …
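Schematically (constants and low-order terms omitted; notation as in the Rademacher entry above), such bounds say that with probability at least 1 - \delta, for every convex combination f of base classifiers from \mathcal{H} and every margin \theta > 0,

    P\big(y f(x) \le 0\big) \;\le\; \hat{P}_n\big(y f(x) \le \theta\big) \;+\; \frac{C}{\theta}\, \mathfrak{R}_n(\mathcal{H}) \;+\; \sqrt{\frac{\ln(1/\delta)}{2n}}.

Because the Rademacher complexity of the convex hull of \mathcal{H} equals that of \mathcal{H} itself, the bound does not deteriorate as more base classifiers are combined.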

[PDF] Large scale transductive SVMs

R Collobert, F Sinz, J Weston, L Bottou… - Journal of Machine …, 2006 - jmlr.org
We show how the concave-convex procedure can be applied to transductive SVMs, which
traditionally require solving a combinatorial search problem. This provides for the first time a …
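The concave-convex procedure the abstract refers to is easy to illustrate on a toy objective (a generic sketch, not the paper's transductive SVM cost): write the cost as a convex part plus a concave part, then repeatedly minimize the convex part plus a linearization of the concave part at the current iterate.

    import numpy as np

    def cccp(x0, steps=20):
        # Minimize f(x) = x**2 - log(1 + x**2), split as convex part x**2 plus concave part -log(1 + x**2).
        x = x0
        for _ in range(steps):
            g = -2 * x / (1 + x ** 2)   # gradient of the concave part at the current iterate
            x = -g / 2                  # closed-form minimizer of x**2 + g*x (convex part + linearized concave part)
        return x

    print(cccp(3.0))   # converges towards 0, the global minimizer of f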

An analysis of diversity measures

EK Tang, PN Suganthan, X Yao - Machine learning, 2006 - Springer
Diversity among the base classifiers is deemed to be important when constructing a
classifier ensemble. Numerous algorithms have been proposed to construct a good …
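Two pairwise measures commonly examined in this literature, the Q statistic and the disagreement measure, take only a few lines to compute; a sketch assuming boolean correctness vectors for a pair of classifiers (the function name is illustrative):

    import numpy as np

    def pairwise_diversity(correct_i, correct_j):
        # Q statistic and disagreement measure for two classifiers, given boolean
        # arrays marking which test examples each classifier got right.
        a, b = np.asarray(correct_i, bool), np.asarray(correct_j, bool)
        n11 = np.sum(a & b)        # both correct
        n00 = np.sum(~a & ~b)      # both wrong
        n10 = np.sum(a & ~b)       # only the first correct
        n01 = np.sum(~a & b)       # only the second correct
        q = (n11 * n00 - n01 * n10) / (n11 * n00 + n01 * n10 + 1e-12)
        disagreement = (n10 + n01) / len(a)
        return q, disagreement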

An introduction to boosting and leveraging

R Meir, G Rätsch - Advanced Lectures on Machine Learning: Machine …, 2003 - Springer
We provide an introduction to theoretical and practical aspects of Boosting and Ensemble
learning, providing a useful reference for researchers in the field of Boosting as well as for …