Title
Model selection by bootstrap penalization for classification
Abstract
We consider the binary classification problem. Given an i.i.d. sample drawn from the distribution of an X × {0, 1}-valued random pair, we propose to estimate the so-called Bayes classifier by minimizing the sum of the empirical classification error and a penalty term based on Efron's or i.i.d. weighted bootstrap samples of the data. We obtain exponential inequalities for such bootstrap-type penalties, which allow us to derive non-asymptotic properties for the corresponding estimators. In particular, we prove that these estimators achieve the global minimax risk over sets of functions built from Vapnik-Chervonenkis classes. The obtained results generalize those of Koltchinskii (2001) and Bartlett et al. (2002) for Rademacher penalties, which can thus be seen as special examples of bootstrap-type penalties. To illustrate this, we carry out an experimental study comparing the different methods on an intervals model selection problem.
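The penalized criterion described in the abstract can be sketched numerically. The following Python snippet is a hypothetical toy illustration, not the paper's code: the data-generating setup, the grid sizes for the nested threshold ("interval"-type) classes, and the Monte Carlo sample counts are all assumptions. It estimates a Rademacher penalty and an Efron-bootstrap-type penalty for a class of threshold classifiers, then selects among nested classes by minimizing empirical error plus penalty.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data (assumed setup): X uniform on [0, 1], Y = 1{X > 0.5} with 10% label noise.
n = 200
X = rng.uniform(0.0, 1.0, n)
Y = ((X > 0.5) ^ (rng.uniform(0.0, 1.0, n) < 0.1)).astype(int)

def loss_matrix(X, Y, thresholds):
    # Row f, column i: 0/1 classification loss of f_t(x) = 1{x > t} on (X_i, Y_i).
    return np.array([((X > t).astype(int) != Y) for t in thresholds], dtype=float)

def rademacher_penalty(losses, n_mc=200):
    # Monte Carlo estimate of E_sigma sup_f (1/n) sum_i sigma_i * loss_i(f).
    n = losses.shape[1]
    sups = [np.max(losses @ rng.choice([-1.0, 1.0], size=n)) / n for _ in range(n_mc)]
    return float(np.mean(sups))

def efron_bootstrap_penalty(losses, n_mc=200):
    # Monte Carlo estimate of E_W sup_f (1/n) sum_i (W_i - 1) * loss_i(f),
    # with Efron bootstrap weights W ~ Multinomial(n, (1/n, ..., 1/n)).
    n = losses.shape[1]
    p = np.full(n, 1.0 / n)
    sups = [np.max(losses @ (rng.multinomial(n, p) - 1.0)) / n for _ in range(n_mc)]
    return float(np.mean(sups))

def select_model(X, Y, penalty_fn, max_k=5):
    # Minimize (empirical error of the ERM in F_k) + penalty(F_k) over nested
    # classes F_k of threshold classifiers on grids of 2^k + 1 points.
    best_crit, best_t = np.inf, None
    for k in range(1, max_k + 1):
        thr = np.linspace(0.0, 1.0, 2**k + 1)
        losses = loss_matrix(X, Y, thr)
        emp = losses.mean(axis=1)
        crit = emp.min() + penalty_fn(losses)
        if crit < best_crit:
            best_crit, best_t = crit, thr[emp.argmin()]
    return best_crit, best_t

losses = loss_matrix(X, Y, np.linspace(0.0, 1.0, 33))
pen_rad = rademacher_penalty(losses)
pen_boot = efron_bootstrap_penalty(losses)
crit, t_hat = select_model(X, Y, efron_bootstrap_penalty)
print(f"Rademacher penalty ~ {pen_rad:.3f}, bootstrap penalty ~ {pen_boot:.3f}, "
      f"selected threshold {t_hat:.3f}")
```

Both penalties are expectations of a supremum of a centered weighted empirical sum, so each is nonnegative in expectation; the Rademacher case corresponds to replacing the centered bootstrap weights W_i - 1 by random signs, which is the sense in which the abstract calls Rademacher penalties a special example of bootstrap-type penalties.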
Year: 2007
DOI: 10.1007/s10994-006-7679-y
Venue: Machine Learning
Keywords: Model selection, Classification, Bootstrap penalty, Exponential inequality, Oracle inequality, Minimax risk
Field: Discrete mathematics, Mathematical optimization, Minimax, Exponential function, Binary classification, Bootstrapping, Model selection, Mathematics, Bootstrapping (electronics), Bayes classifier, Estimator
DocType: Journal
Volume: 66
Issue: 2-3
ISSN: 0885-6125
Citations: 6
PageRank: 0.69
References: 11
Authors: 1
Name: Magalie Fromont
Order: 1
Citations: 13
PageRank: 2.24