Title
A Convex Approach to Validation-Based Learning of the Regularization Constant
Abstract
This letter investigates a tight convex relaxation to the problem of tuning the regularization constant with respect to a validation-based criterion. A number of algorithms are covered, including ridge regression, regularization networks, smoothing splines, and least squares support vector machines (LS-SVMs) for regression. This convex approach allows the application of reliable and efficient tools, thereby reducing the computational cost and improving the automation of the learning method. It is shown that all solutions of the relaxation admit an interpretation in terms of a solution to a weighted LS-SVM.
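As an illustration of the validation-based tuning problem described in the abstract, the sketch below selects the regularization constant of ridge regression by a plain grid search against a held-out validation criterion. This is not the paper's convex relaxation; the data set, train/validation split, and gamma grid are illustrative assumptions.

```python
# Minimal sketch (assumed setup, not the paper's method): choose the ridge
# regularization constant gamma that minimizes a validation-set criterion.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic regression data (illustrative assumption).
n, d = 200, 10
X = rng.normal(size=(n, d))
w_true = rng.normal(size=d)
y = X @ w_true + 0.5 * rng.normal(size=n)

# Training / validation split (illustrative assumption).
X_tr, y_tr = X[:150], y[:150]
X_val, y_val = X[150:], y[150:]

def ridge_fit(X, y, gamma):
    """Ridge regression solution: w = (X'X + gamma*I)^{-1} X'y."""
    p = X.shape[1]
    return np.linalg.solve(X.T @ X + gamma * np.eye(p), X.T @ y)

# Validation-based criterion: mean squared error on the held-out set.
gammas = np.logspace(-4, 4, 50)
val_errors = [np.mean((X_val @ ridge_fit(X_tr, y_tr, g) - y_val) ** 2)
              for g in gammas]

best_gamma = gammas[int(np.argmin(val_errors))]
print(f"selected regularization constant: {best_gamma:.4g}")
```

The convex relaxation studied in the paper replaces this kind of non-convex, search-based tuning with a single convex program; the grid search above only serves to make the validation criterion concrete.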
Year
2007
DOI
10.1109/TNN.2007.891187
Venue
IEEE Transactions on Neural Networks
Keywords
least squares approximations, regression analysis, splines (mathematics), support vector machines, convex relaxation, least squares support vector machines, regularization constant, regularization networks, ridge regression, smoothing splines, validation-based learning, convex optimization, model selection, regularization
Field
Least squares, Mathematical optimization, Convex combination, Computer science, Regularization (mathematics), Artificial intelligence, Proximal gradient methods for learning, Proper convex function, Convex optimization, Convex analysis, Machine learning, Regularization perspectives on support vector machines
DocType
Journal
Volume
18
Issue
3
ISSN
1045-9227
Citations
5
PageRank
0.42
References
10
Authors
3
Name                  Order  Citations  PageRank
K. Pelckmans          1      146        10.03
Johan A. K. Suykens   2      635        53.51
Bart De Moor          3      5541       474.71