Title
Support vector machines and regularization
Abstract
Recently there has been growing interest in Statistical Learning Theory, also known as VC theory, due to many successful applications of Support Vector Machines (SVMs). Even though most theoretical results in VC theory (including all main concepts underlying SVM methodology) were developed more than 25 years ago, these concepts are occasionally misunderstood in the research community. This paper compares standard SVM regression with the regularization approach for learning dependencies from data. We point out that the SVM approach was developed in VC theory under the risk minimization setting, whereas the regularization approach was developed under the function approximation setting. This distinction is especially important because regularization-based learning is often presented as a purely constructive methodology (with no clearly stated problem setting), even though the original regularization theory was introduced under a clearly stated function approximation setting. Further, we present empirical comparisons illustrating the effect of different mechanisms for complexity control (i.e., epsilon-insensitive loss vs. standard ridge regression) on generalization performance, under very simple settings using synthetic data sets. These comparisons suggest that the SVM approach to complexity control (via the epsilon-insensitive loss) is more appropriate for learning in sparse, high-dimensional settings.
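The abstract contrasts two mechanisms for complexity control: the epsilon-insensitive loss of SVM regression and the L2 penalty of ridge regression. The Python sketch below illustrates one way such a comparison could be set up on a synthetic sparse high-dimensional problem; it is not the paper's original experimental protocol, and all sample sizes and hyperparameter values (n_features, alpha, C, epsilon) are assumptions chosen for illustration.

```python
# Minimal sketch (not the authors' experiment): compare ridge regression
# (L2 penalty) with linear epsilon-SVR (epsilon-insensitive loss) on a
# synthetic sparse high-dimensional regression problem.
# All parameter values below are illustrative assumptions.
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.svm import SVR
from sklearn.metrics import mean_squared_error

rng = np.random.RandomState(0)

# Sparse high-dimensional target: only a few of many inputs are relevant.
n_train, n_test, n_features = 50, 500, 100
w_true = np.zeros(n_features)
w_true[:5] = [2.0, -1.5, 1.0, 0.5, -0.5]  # 5 relevant features

X_train = rng.randn(n_train, n_features)
X_test = rng.randn(n_test, n_features)
y_train = X_train @ w_true + 0.5 * rng.randn(n_train)  # noisy training targets
y_test = X_test @ w_true                                # noise-free test targets

# Ridge regression: complexity controlled by the L2 penalty weight alpha.
ridge = Ridge(alpha=1.0).fit(X_train, y_train)

# Linear SVM regression: complexity controlled by epsilon (and C).
svr = SVR(kernel="linear", C=1.0, epsilon=0.5).fit(X_train, y_train)

for name, model in [("ridge", ridge), ("epsilon-SVR", svr)]:
    mse = mean_squared_error(y_test, model.predict(X_test))
    print(f"{name:12s} test MSE: {mse:.3f}")
```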
Year
2005
Venue
Seventh IASTED International Conference on Signal and Image Processing
Keywords
function approximation, regularization, structural risk minimization
Field
Function approximation, Least squares support vector machine, Computer science, Support vector machine, Algorithm, Regularization (mathematics), Artificial intelligence, Relevance vector machine, Structural risk minimization, Machine learning, Regularization perspectives on support vector machines
DocType
Conference
Citations
0
PageRank
0.34
References
1
Authors
2
Author details (Name, Order, Citations, PageRank)
Vladimir Cherkassky, 1, 1064, 126.66
Yunqian Ma, 2, 533, 44.21