Title: Data-driven calibration of linear estimators with minimal penalties
Abstract: This paper tackles the problem of selecting among several linear estimators in non-parametric regression; this includes model selection for linear regression, the choice of a regularization parameter in kernel ridge regression or spline smoothing, and the choice of a kernel in multiple kernel learning. We propose a new algorithm that first consistently estimates the variance of the noise, based on the concept of minimal penalty, previously introduced in the context of model selection. Plugging this variance estimate into Mallows' $C_L$ penalty is then proved to yield an algorithm satisfying an oracle inequality. Simulation experiments with kernel ridge regression and multiple kernel learning show that the proposed algorithm often significantly improves on existing calibration procedures such as 10-fold cross-validation or generalized cross-validation.
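To make the abstract's two-step procedure concrete, below is a minimal sketch specialized to kernel ridge regression, where each candidate estimator is linear, $\hat f_\lambda = A_\lambda Y$ with $A_\lambda = K(K + n\lambda I)^{-1}$. The minimal penalty $(2\,\mathrm{tr}\,A_\lambda - \mathrm{tr}\,A_\lambda A_\lambda^\top)\,\sigma^2/n$ and Mallows' penalty $2\sigma^2\,\mathrm{tr}(A_\lambda)/n$ follow standard definitions for linear estimators; the candidate grids, the $n/2$ jump threshold, and all function names (`select_lambda`, `smoothing_matrix`) are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def smoothing_matrix(K, lam):
    """Kernel ridge smoothing matrix A_lam, so that A_lam @ Y is the fit."""
    n = K.shape[0]
    return K @ np.linalg.solve(K + n * lam * np.eye(n), np.eye(n))

def select_lambda(Y, K, lambdas, C_grid):
    """Two-step selection: minimal-penalty variance estimate, then C_L plug-in."""
    n = len(Y)
    stats = []
    for lam in lambdas:
        A = smoothing_matrix(K, lam)
        resid = Y - A @ Y
        emp_risk = resid @ resid / n          # empirical risk (1/n)||Y - A Y||^2
        tr_A = np.trace(A)                    # effective degrees of freedom
        tr_AAt = np.einsum('ij,ij->', A, A)   # tr(A A^T) = sum of squared entries
        stats.append((lam, emp_risk, tr_A, tr_AAt))

    # Step 1: for each constant C, minimize the minimal-penalty criterion
    #   emp_risk + C * (2 tr A - tr(A A^T)) / n
    # and record the effective df of the minimizer; the selected df jumps
    # from ~n down to moderate values as C crosses the noise variance.
    def df_selected(C):
        crit = [er + C * (2 * trA - trAAt) / n for _, er, trA, trAAt in stats]
        return stats[int(np.argmin(crit))][2]

    # Step 2: variance estimate = first C (on an increasing grid) whose
    # selected df falls below n/2 -- an illustrative jump threshold.
    sigma2_hat = next((C for C in C_grid if df_selected(C) <= n / 2), C_grid[-1])

    # Step 3: plug sigma2_hat into Mallows' C_L penalty 2 sigma^2 tr(A) / n.
    crit = [er + 2 * sigma2_hat * trA / n for _, er, trA, _ in stats]
    lam_hat = stats[int(np.argmin(crit))][0]
    return lam_hat, sigma2_hat

# Usage on synthetic data (Gaussian kernel and grids are assumptions):
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, 200)
Y = np.sin(4 * X) + 0.5 * rng.standard_normal(200)   # true sigma^2 = 0.25
K = np.exp(-(X[:, None] - X[None, :]) ** 2 / 0.1)
lam_hat, sigma2_hat = select_lambda(
    Y, K, lambdas=np.logspace(-8, 2, 50), C_grid=np.logspace(-4, 2, 200))
```

The key diagnostic is the jump of the selected degrees of freedom as a function of C: for C below the noise variance the criterion overfits (df near n), and above it the df drops sharply, which is what makes the variance estimate data-driven.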
Year: 2009
Venue: NIPS
Keywords: linear regression, simulation experiment, satisfiability, model selection, non parametric regression
Field: Mathematical optimization, Principal component regression, Kernel embedding of distributions, Nonparametric regression, Multiple kernel learning, Polynomial kernel, Variable kernel density estimation, Kernel regression, Mathematics, Kernel (statistics)
DocType: Conference
Citations: 8
PageRank: 0.79
References: 17
Authors: 2
Name            Order   Citations   PageRank
Sylvain Arlot   1       65          6.87
Francis Bach    2       11490       622.29