Title
Automatic model selection for the optimization of SVM kernels
Abstract
This approach aims to optimize the kernel parameters and to efficiently reduce the number of support vectors, so that the generalization error can be reduced drastically. The proposed methodology introduces a new model selection criterion based on an estimate of the probability of error of the SVM classifier. For comparison, we consider two further model selection criteria: GACV ('Generalized Approximate Cross-Validation') and the VC ('Vapnik–Chervonenkis') dimension. These criteria are algebraic estimates of upper bounds on the expected error. For the former, we also propose a new minimization scheme. Experiments conducted on a two-class problem show that the SVM hyper-parameters can be chosen adequately using the empirical error criterion; moreover, this criterion yields a less complex model with fewer support vectors. For multi-class data, the optimization strategy is adapted to the one-against-one data partitioning. The approach is then evaluated on images of handwritten digits from the USPS database.
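The abstract describes choosing SVM hyper-parameters by minimizing an estimate of the classifier's probability of error. As a minimal illustrative sketch of that model-selection idea only (not the paper's algorithm), the following pure-Python example selects the width of an RBF kernel by minimizing a k-fold cross-validation error estimate; the kernel nearest-mean classifier, the toy dataset, and the candidate grid are all assumptions made for illustration.

```python
import math
import random

def rbf(x, y, sigma):
    # Gaussian (RBF) kernel between two feature vectors
    return math.exp(-sum((a - b) ** 2 for a, b in zip(x, y)) / (2 * sigma ** 2))

def classify(x, train, sigma):
    # Toy kernel nearest-mean rule: score each class by its
    # average kernel similarity to x, predict the best-scoring class.
    scores = {}
    for xi, yi in train:
        scores.setdefault(yi, []).append(rbf(x, xi, sigma))
    return max(scores, key=lambda c: sum(scores[c]) / len(scores[c]))

def cv_error(data, sigma, k=5):
    # k-fold cross-validation estimate of the probability of error
    folds = [data[i::k] for i in range(k)]
    errors = 0
    for i in range(k):
        test = folds[i]
        train = [p for j in range(k) if j != i for p in folds[j]]
        errors += sum(classify(x, train, sigma) != y for x, y in test)
    return errors / len(data)

# Hypothetical two-class data: two Gaussian blobs
random.seed(0)
data = [((random.gauss(0, 1), random.gauss(0, 1)), 0) for _ in range(40)] + \
       [((random.gauss(3, 1), random.gauss(3, 1)), 1) for _ in range(40)]
random.shuffle(data)

# Model selection: pick the kernel width minimizing the error estimate
candidates = [0.1, 0.5, 1.0, 2.0, 5.0]
best_sigma = min(candidates, key=lambda s: cv_error(data, s))
```

In the same spirit, the paper's one-against-one adaptation would repeat such a per-kernel selection for every pair of classes in a multi-class problem.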
Year
2005
DOI
10.1016/j.patcog.2005.03.011
Venue
Pattern Recognition
Keywords
empirical error, new minimization scheme, gacv, empirical error criterion, automatic model selection, svm, model selection criterion, fewer support vectors, kernel, generalization error, model selection, svm classifier, complex model, expected error, svm kernel, vc, multi-class data, new model selection criterion, probability of error, support vectors, upper bound
Field
Kernel (linear algebra), Algebraic number, Pattern recognition, Support vector machine, Model selection, Minification, Artificial intelligence, Svm classifier, Probability of error, Data partitioning, Mathematics, Machine learning
DocType
Journal
Volume
38
Issue
10
ISSN
Citations
52
PageRank
2.38
References
25
Authors
3
Name             Order  Citations  PageRank
Ayat, N.E.       1      85         5.14
Mohamed Cheriet  2      20472      38.58
Ching Y. Suen    3      75691      127.54