Title
Selection of Support Vector Kernel Parameters for Improved Generalization
Abstract
The selection of kernel parameters is an open problem in the training of nonlinear support vector machines. The usual selection criterion is the quotient of the radius of the smallest sphere enclosing the training features and the margin width. Empirical studies on real-world data using Gaussian and polynomial kernels show that the test error due to this criterion is often much larger than the minimum test error. In other words, this criterion can be suboptimal or inadequate. Hence, we propose augmenting the usual criterion with a traditional measure of class separability in statistical feature selection. This measure employs the within-class and between-class scatter in feature space, which is equivalent to computing the pooled covariance matrix trace and the distance between class means. We show empirically that the new criterion results in improved generalization.
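The quantities named in the abstract can be evaluated directly from the kernel (Gram) matrix, so a rough sense of the criterion can be given in code. The following Python sketch is illustrative only and is not the authors' implementation: the dataset, the Gaussian-width grid, the use of scikit-learn's SVC, and the approximation of the enclosing-sphere radius by the largest distance to the feature-space centroid are all assumptions made for this example. It computes, for each kernel width, a radius-margin quantity and a separability score (between-class mean distance over pooled within-class covariance trace) of the kind the abstract proposes to combine with the usual criterion.

```python
# Illustrative sketch (not the paper's code): kernel-space separability and an
# approximate radius/margin quantity, both computed from the Gram matrix.
import numpy as np
from sklearn.datasets import make_moons          # toy data, an assumption here
from sklearn.svm import SVC
from sklearn.metrics.pairwise import rbf_kernel

def separability(K, y):
    """Squared distance between the two class means divided by the trace of the
    pooled within-class covariance, both in the kernel-induced feature space."""
    a, b = np.unique(y)
    Kaa = K[np.ix_(y == a, y == a)]
    Kbb = K[np.ix_(y == b, y == b)]
    Kab = K[np.ix_(y == a, y == b)]
    # ||mu_a - mu_b||^2 expressed through kernel averages.
    d2 = Kaa.mean() + Kbb.mean() - 2.0 * Kab.mean()
    # Pooled within-class covariance trace (per-sample scatter around each mean).
    sw = sum(np.trace(K[np.ix_(y == c, y == c)]) / (y == c).sum()
             - K[np.ix_(y == c, y == c)].mean() for c in (a, b))
    return d2 / sw

X, y = make_moons(n_samples=200, noise=0.2, random_state=0)
for gamma in [0.01, 0.1, 1.0, 10.0]:
    K = rbf_kernel(X, gamma=gamma)
    svm = SVC(kernel="precomputed", C=10.0).fit(K, y)
    sv = svm.support_
    alpha = svm.dual_coef_                       # alpha_i * y_i for support vectors
    # ||w||^2 = sum_ij alpha_i alpha_j y_i y_j K_ij; margin width is 2/||w||.
    w2 = (alpha @ K[np.ix_(sv, sv)] @ alpha.T).item()
    # Crude radius estimate: max squared distance to the feature-space centroid.
    r2 = float(np.max(np.diag(K) - 2.0 * K.mean(axis=1) + K.mean()))
    print(f"gamma={gamma:5.2f}  radius-margin ~ {r2 * w2:10.2f}  "
          f"separability = {separability(K, y):.4f}")
```

Under these assumptions, sweeping the kernel width and inspecting the two scores side by side mirrors the idea of the paper: the radius-margin quotient alone may favor a width that a separability measure (and the test error) would not.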
Year
2000
Venue
ICML
Keywords
improved generalization, support vector kernel parameters, support vector, empirical study, feature selection, feature space, covariance matrix, support vector machine
Field
Graph kernel, Structured support vector machine, Least squares support vector machine, Radial basis function kernel, Pattern recognition, Computer science, Support vector machine, Polynomial kernel, Artificial intelligence, Relevance vector machine, Kernel method, Machine learning
DocType
Conference
ISBN
1-55860-707-2
Citations
0
PageRank
0.34
References
4
Authors
2
Name            Order    Citations    PageRank
Loo-Nin Teow    1        103          17.29
Kia-Fock Loe    2        180          20.88