Title
Functional classification with margin conditions
Abstract
Let $(X,Y)$ be a $\mathcal{X} \times \{0,1\}$-valued random pair and consider a sample $(X_1,Y_1),\dots,(X_n,Y_n)$ drawn from the distribution of $(X,Y)$. We aim at constructing from this sample a classifier, that is, a function predicting the value of $Y$ from the observation of $X$. The special case where $\mathcal{X}$ is a functional space is of particular interest due to the so-called curse of dimensionality. In a recent paper, Biau et al. [1] propose to filter the $X_i$'s in the Fourier basis and to apply the classical $k$-nearest neighbor rule to the first $d$ coefficients of the expansion, the selection of both $k$ and $d$ being made automatically via a penalized criterion. We extend this study and note that the penalty used by Biau et al. is too heavy from the minimax point of view under some margin type assumptions. We prove that using a penalty of smaller order, or even equal to zero, is preferable both in theory and in practice. Our experimental study furthermore shows that introducing a small-order penalty stabilizes the selection process while preserving rather good performance.
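The scheme the abstract describes can be sketched as follows. This is a minimal illustrative implementation, not the authors' exact procedure: the holdout split, the candidate grids, and the penalty shape `penalty * d / n` are assumptions for the sketch; the paper's point is that the penalty weight can be taken of small order, or even zero.

```python
import numpy as np

def fourier_coeffs(x, d):
    """First d real features of the discrete Fourier expansion of a curve x
    sampled on a regular grid (a simple stand-in for the filtering step)."""
    c = np.fft.rfft(x) / len(x)
    feats = np.empty(2 * len(c))
    feats[0::2] = c.real   # interleave real and imaginary parts
    feats[1::2] = c.imag
    return feats[:d]

def knn_predict(train_feats, train_y, test_feats, k):
    """Plain k-nearest-neighbor majority vote with Euclidean distance."""
    preds = []
    for t in test_feats:
        dist = np.linalg.norm(train_feats - t, axis=1)
        nearest = train_y[np.argsort(dist)[:k]]
        preds.append(int(nearest.mean() > 0.5))
    return np.array(preds)

def select_k_d(X, y, ks, ds, penalty=0.0):
    """Choose (k, d) minimizing holdout error + penalty * d / n.

    With penalty=0.0 this is pure empirical risk minimization over the
    grid, the regime the abstract argues is already preferable."""
    n = len(y)
    split = n // 2
    best, best_score = None, np.inf
    for d in ds:
        feats = np.array([fourier_coeffs(x, d) for x in X])
        for k in ks:
            pred = knn_predict(feats[:split], y[:split], feats[split:], k)
            err = np.mean(pred != y[split:])
            score = err + penalty * d / n
            if score < best_score:
                best, best_score = (k, d), score
    return best, best_score
```

On two well-separated classes of curves (e.g. noisy `sin(t)` versus `sin(2t)`), the selected `d` picks up the frequency that discriminates the classes, and varying `penalty` over small values shows how a light penalty biases the choice toward smaller `d`.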
Year
DOI
Venue
2006
10.1007/11776420_10
COLT
Keywords
Field
DocType
small-order penalty, selection process, classical k-nearest neighbor rule, margin condition, margin type assumption, minimax point, functional classification, functional space, good performance, experimental study, Fourier basis, function space, k-nearest neighbor, curse of dimensionality
Convergence (routing), Minimax, Artificial intelligence, Basis function, Classifier (linguistics), Special case, Discrete mathematics, Mathematical optimization, Regression, Curse of dimensionality, Machine learning, Mathematics, Minimax problem
Conference
Volume
ISSN
ISBN
4005
0302-9743
3-540-35294-5
Citations 
PageRank 
References 
4
0.44
8
Authors
2
Name	Order	Citations	PageRank
Magalie Fromont	1	13	2.24
Christine Tuleau	2	4	0.44