Title
Unlabeled patterns to tighten Rademacher complexity error bounds for kernel classifiers
Abstract
We derive new upper bounds on the generalization error of kernel classifiers, i.e. the misclassification rate that a model will achieve on new, previously unseen data. Although this paper focuses on error estimation, the derived bounds can obviously also be exploited in practice for model selection purposes. The bounds are based on Rademacher complexity and prove particularly useful when a set of unlabeled samples is available in addition to the (labeled) training examples: we show that, by exploiting the unlabeled patterns, the confidence term of the conventional Rademacher complexity bound can be reduced by a factor of three. Moreover, the availability of unlabeled examples also allows further improvements, obtained by building localized versions of the hypothesis class containing the optimal classifier.
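The central quantity of the abstract, the empirical Rademacher complexity, can be illustrated with a short sketch. For a ball of radius B in the RKHS induced by a kernel, the supremum over the class has the closed form (B/n)·sqrt(σᵀKσ), so the complexity can be estimated by Monte Carlo over random sign vectors σ. The RBF kernel, the radius B, and all function names below are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def rbf_kernel(X, gamma=1.0):
    # Gram matrix K[i, j] = exp(-gamma * ||x_i - x_j||^2)
    sq = np.sum(X**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-gamma * np.maximum(d2, 0.0))

def empirical_rademacher(K, B=1.0, n_draws=2000, seed=0):
    """Monte Carlo estimate of the empirical Rademacher complexity of the
    RKHS ball {f : ||f|| <= B}. For this class the supremum over f of
    (1/n) * sum_i sigma_i f(x_i) equals (B/n) * sqrt(sigma^T K sigma),
    so we average that closed form over random Rademacher vectors sigma."""
    rng = np.random.default_rng(seed)
    n = K.shape[0]
    sigmas = rng.choice([-1.0, 1.0], size=(n_draws, n))
    # quadratic form sigma^T K sigma for every draw at once
    quad = np.einsum("ij,jk,ik->i", sigmas, K, sigmas)
    return float(np.mean(B / n * np.sqrt(quad)))

# toy data: 50 samples in 3 dimensions
X = np.random.default_rng(1).normal(size=(50, 3))
K = rbf_kernel(X, gamma=0.5)
r_hat = empirical_rademacher(K, B=1.0)
# classical upper bound (B/n) * sqrt(trace(K)), which the estimate never exceeds
bound = np.sqrt(np.trace(K)) / K.shape[0]
```

By Jensen's inequality the Monte Carlo estimate `r_hat` stays below the well-known trace bound, which is the kind of data-dependent quantity that the paper's unlabeled-sample arguments tighten further.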
Year
2014
DOI
10.1016/j.patrec.2013.04.027
Venue
Pattern Recognition Letters
Keywords
rademacher complexity error bound, rademacher complexity, confidence term, hypothesis class, error estimation topic, unlabeled example, unlabeled sample, unlabeled pattern, conventional rademacher complexity, generalization error, work new upper bound, kernel classifier, structural risk minimization, model selection, support vector machine
Field
Kernel (linear algebra), Pattern recognition, Rademacher complexity, Support vector machine, Model selection, Artificial intelligence, Generalization error, Structural risk minimization, Classifier (linguistics), Machine learning, Mathematics
DocType
Journal
Volume
37
ISSN
0167-8655
Citations
9
PageRank
0.45
References
15
Authors
4
Name             Order  Citations  PageRank
Davide Anguita   1      1001       70.58
Alessandro Ghio  2      667        35.71
Luca Oneto       3      830        63.22
Sandro Ridella   4      677        140.62