Title: Support vector regression methods for functional data
Abstract
Many regression tasks in practice involve digitized functions as predictor variables. This has motivated the development of regression methods for functional data. In particular, Nadaraya-Watson Kernel (NWK) and Radial Basis Function (RBF) estimators have recently been extended to functional nonparametric regression models. However, these methods do not allow for dimensionality reduction. For this purpose, we introduce Support Vector Regression (SVR) methods for functional data. These are formulated in the framework of approximation in reproducing kernel Hilbert spaces. On this general basis, some of their properties are investigated, with emphasis on the construction of nonnegative definite kernels on functional spaces. Furthermore, the performance of SVR for functional variables is demonstrated on a real-world benchmark spectrometric data set, together with comparisons against the NWK and RBF methods. All three approaches yielded good predictions, but SVR additionally achieved about a 20% reduction in dimensionality.
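The abstract's key idea can be illustrated in miniature: treat each digitized function as a vector of grid samples and fit an epsilon-insensitive SVR with an RBF kernel, so that only curves outside the epsilon tube become support vectors. The following is a minimal sketch under stated assumptions, not the authors' actual method: it uses scikit-learn and synthetic sinusoidal curves (both are assumptions; the paper works with a spectrometric data set and kernels defined directly on functional spaces).

```python
# Hedged sketch: SVR with an RBF (Gaussian) kernel on discretized functional
# predictors. scikit-learn and the synthetic data are illustrative assumptions;
# the paper's construction is more general (kernels on functional spaces).
import numpy as np
from sklearn.svm import SVR

rng = np.random.default_rng(0)

# Simulate n "digitized functions": each predictor is a curve sampled on a grid.
n, grid_size = 100, 50
t = np.linspace(0.0, 1.0, grid_size)
phase = rng.uniform(0.0, np.pi, size=n)
X = np.sin(2.0 * np.pi * t[None, :] + phase[:, None])  # n curves, one per row
y = np.cos(phase) + 0.05 * rng.normal(size=n)          # scalar response

# Epsilon-insensitive loss: training curves fitted within the epsilon tube get
# zero dual weight, so the final predictor depends on a subset of the data --
# the sparsity ("dimensionality reduction") the abstract refers to.
model = SVR(kernel="rbf", C=10.0, epsilon=0.1, gamma=1.0 / grid_size)
model.fit(X, y)

n_sv = len(model.support_)
print(f"support vectors: {n_sv} of {n} training curves")
```

The fraction of retained support vectors depends on `epsilon` and the noise level; widening the tube discards more training curves at the cost of a coarser fit.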
Year: 2007
DOI: 10.1007/978-3-540-76725-1_59
Venue: CIARP
Keywords: support vector regression, kernel function, radial basis function, reproducing kernel Hilbert space, function space, nonparametric regression
Field: Dimensionality reduction, Principal component regression, Pattern recognition, Radial basis function kernel, Kernel embedding of distributions, Computer science, Nonparametric regression, Polynomial regression, Polynomial kernel, Artificial intelligence, Machine learning, Kernel (statistics)
DocType: Conference
Volume: 4756
ISSN: 0302-9743
ISBN: 3-540-76724-X
Citations: 3
PageRank: 0.51
References: 4
Authors: 3

Name                 Order  Citations  PageRank
Noslen Hernández     1      7          4.57
Rolando J. Biscay    2      12         3.54
Isneri Talavera      3      17         3.75