Abstract |
---|
Many regression tasks in practice involve digitized functions as predictor variables. This has motivated the development of regression methods for functional data. In particular, Nadaraya-Watson Kernel (NWK) and Radial Basis Function (RBF) estimators have recently been extended to functional nonparametric regression models. However, these methods do not allow for dimensionality reduction. For this purpose, we introduce Support Vector Regression (SVR) methods for functional data. These are formulated in the framework of approximation in reproducing kernel Hilbert spaces. On this general basis, some of their properties are investigated, emphasizing the construction of nonnegative definite kernels on functional spaces. Furthermore, the performance of SVR for functional variables is demonstrated on a real-world benchmark spectrometric data set and compared with the NWK and RBF methods. Good predictions were obtained by all three approaches, but SVR additionally achieved about a 20% reduction of dimensionality. |
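The abstract's setup can be illustrated in a few lines. A minimal sketch, assuming scikit-learn is available: this is not the authors' RKHS formulation, just standard epsilon-SVR with a Gaussian RBF kernel applied to digitized functional predictors, with synthetic shifted sine curves standing in for the spectrometric benchmark data.

```python
# Hedged sketch: epsilon-SVR on digitized curves (synthetic stand-in data,
# not the paper's spectrometric benchmark or its exact kernel construction).
import numpy as np
from sklearn.svm import SVR

rng = np.random.default_rng(0)
t = np.linspace(0.0, 1.0, 100)                            # common sampling grid
shifts = rng.uniform(0.0, 1.0, size=60)
X = np.sin(2.0 * np.pi * (t[None, :] + shifts[:, None]))  # one digitized curve per row
y = np.sin(2.0 * np.pi * shifts)                          # scalar response per curve

# The RBF kernel exp(-gamma * ||x_i - x_j||^2) on the sampled curves
# discretizes a nonnegative definite Gaussian kernel on the function space:
# the Euclidean norm on the grid approximates the L2 norm of the difference
# of the underlying functions.
model = SVR(kernel="rbf", C=100.0, gamma=0.01, epsilon=0.1).fit(X, y)

# Sparsity of the solution: only the support vectors enter the fitted
# predictor, which is the sense in which SVR reduces effective dimensionality
# relative to NWK/RBF estimators that use every training curve.
print("support vectors:", model.support_.size, "of", X.shape[0])
```

The hyperparameters (`C`, `gamma`, `epsilon`) here are illustrative choices, not values from the paper; in practice they would be selected by cross-validation.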
Year | DOI | Venue |
---|---|---|
2007 | 10.1007/978-3-540-76725-1_59 | CIARP |
Keywords | Field | DocType
---|---|---
support vector regression, kernel function, radial basis function, reproducing kernel hilbert space, function space, nonparametric regression | Dimensionality reduction, Principal component regression, Pattern recognition, Radial basis function kernel, Kernel embedding of distributions, Computer science, Nonparametric regression, Polynomial regression, Polynomial kernel, Artificial intelligence, Machine learning, Kernel (statistics) | Conference
Volume | ISSN | ISBN
---|---|---
4756 | 0302-9743 | 3-540-76724-X

Citations | PageRank | References
---|---|---
3 | 0.51 | 4
Authors |
---|
3 |
Name | Order | Citations | PageRank
---|---|---|---
Noslen Hernández | 1 | 7 | 4.57
Rolando J. Biscay | 2 | 12 | 3.54
Isneri Talavera | 3 | 17 | 3.75