Title
An Optimal Basis For Feature Extraction With Support Vector Machine Classification Using The Radius-Margin Bound
Abstract
A method is presented for deriving an optimal basis for features classified with a support vector machine. The method is based on minimizing the leave-one-out error, which is approximated by the radius-margin bound. A gradient descent method provides a learning rule for the basis in the outer loop of an iterative procedure. The inner loop performs support vector machine training and provides the support vector coefficients on which the gradient descent depends. In this way, the basis for feature extraction and the support vector machine are jointly optimized. The efficacy of the method is illustrated with examples from multi-dimensional synthetic data sets.
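The alternating scheme the abstract describes can be illustrated with a minimal NumPy sketch. This is not the paper's implementation: the inner SVM training is replaced by a hinge-loss subgradient solver, the enclosing-sphere radius R by a centroid-based approximation, the basis gradient by finite differences rather than the analytic learning rule, and all names and data are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic two-class data in 4-D; only the first two dims are informative.
X = np.vstack([rng.normal(-1.0, 1.0, (40, 2)), rng.normal(1.0, 1.0, (40, 2))])
X = np.hstack([X, rng.normal(0.0, 1.0, (80, 2))])  # two pure-noise dims
y = np.array([-1.0] * 40 + [1.0] * 40)

def train_linear_svm(Z, y, lam=0.01, epochs=200, lr=0.05):
    """Soft-margin linear SVM via subgradient descent on the hinge loss
    (a stand-in for the exact QP solver the inner loop would use)."""
    w, b = np.zeros(Z.shape[1]), 0.0
    for _ in range(epochs):
        active = y * (Z @ w + b) < 1.0          # margin-violating points
        grad_w = lam * w - (y[active, None] * Z[active]).sum(axis=0) / len(y)
        grad_b = -y[active].sum() / len(y)
        w -= lr * grad_w
        b -= lr * grad_b
    return w, b

def radius_margin_bound(B):
    """R^2 * ||w||^2, with R approximated by the maximum distance of the
    transformed data to its centroid (the paper would use the minimal
    enclosing sphere)."""
    Z = X @ B
    w, _ = train_linear_svm(Z, y)
    R2 = np.max(np.sum((Z - Z.mean(axis=0)) ** 2, axis=1))
    return R2 * (w @ w)

# Outer loop: gradient descent on the basis B (finite differences here,
# in place of the analytic gradient derived in the paper).
B = rng.normal(0.0, 0.3, (4, 2))
eps, step = 1e-4, 0.05
initial = radius_margin_bound(B)
for _ in range(15):
    G = np.zeros_like(B)
    for i in range(B.shape[0]):
        for j in range(B.shape[1]):
            Bp, Bm = B.copy(), B.copy()
            Bp[i, j] += eps
            Bm[i, j] -= eps
            G[i, j] = (radius_margin_bound(Bp) - radius_margin_bound(Bm)) / (2 * eps)
    B -= step * G / (np.linalg.norm(G) + 1e-12)  # normalized descent step
final = radius_margin_bound(B)
print(initial, final)
```

Each outer step retrains the SVM on the transformed features before evaluating the bound, which is the joint-optimization structure the abstract refers to.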
Year
2006
Venue
2006 IEEE International Conference on Acoustics, Speech and Signal Processing, Vols 1-13
DocType
Conference
ISSN
1520-6149
Citations
0
PageRank
0.34
References
9
Authors
2
Name             Order  Citations  PageRank
Jeff Fortuna     1      0          0.68
David W. Capson  2      207        29.98