Title |
---|
Support Vector Regression for the simultaneous learning of a multivariate function and its derivatives |
Abstract |
---|
In this paper, the problem of simultaneously approximating a function and its derivatives is formulated within the Support Vector Machine (SVM) framework. First, the problem is solved for a one-dimensional input space using the ε-insensitive loss function and introducing additional constraints on the approximation of the derivative. Then, the method is extended to multi-dimensional input spaces by means of a multidimensional regression algorithm. In both cases, to solve the resulting regression estimation problem, we derive an iterative re-weighted least squares (IRWLS) procedure that is fast for moderate-size problems. Experiments show that using derivative information significantly improves the reconstruction of the function. |
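The core idea of the abstract, that derivative samples added as extra constraints improve the reconstruction of a function, can be illustrated with a much simpler least-squares analogue of the paper's SVM formulation. The sketch below (all names and parameters are hypothetical, and it uses plain least squares rather than the ε-insensitive loss or the IRWLS procedure) fits a Gaussian RBF expansion to samples of both f and f' by stacking the function and derivative design matrices into one linear system:

```python
import numpy as np

# Training data: samples of a function and of its derivative.
rng_x = np.linspace(0.0, 2.0 * np.pi, 15)   # training inputs
y = np.sin(rng_x)                            # function samples
dy = np.cos(rng_x)                           # derivative samples

centers = rng_x                              # one Gaussian RBF per training point
sigma = 0.8                                  # RBF width (chosen by hand here)

def rbf(xq):
    """Design matrix Phi[i, j] = exp(-(xq_i - c_j)^2 / (2 sigma^2))."""
    d = xq[:, None] - centers[None, :]
    return np.exp(-d**2 / (2.0 * sigma**2))

def rbf_deriv(xq):
    """Derivative of each basis function with respect to the input."""
    d = xq[:, None] - centers[None, :]
    return -d / sigma**2 * rbf(xq)

# Stack function and derivative constraints into one least-squares problem.
A = np.vstack([rbf(rng_x), rbf_deriv(rng_x)])
b = np.concatenate([y, dy])
w, *_ = np.linalg.lstsq(A, b, rcond=None)

# Evaluate on a dense grid: the derivative constraints tighten the fit
# between the training points.
xt = np.linspace(0.0, 2.0 * np.pi, 200)
err = np.max(np.abs(rbf(xt) @ w - np.sin(xt)))
print(f"max reconstruction error: {err:.4f}")
```

This is only a least-squares sketch of the general principle; the paper itself imposes the derivative constraints inside the SVM optimization problem and solves it with IRWLS.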
Year | DOI | Venue |
---|---|---|
2005 | 10.1016/j.neucom.2005.02.013 | Neurocomputing |
Keywords | Field | DocType |
---|---|---|
ε-insensitive loss function, support vector regression, support vector machine, input space, additional constraint, multidimensional regression algorithm, regression estimation problem, one-dimensional input space, multivariate function, moderate-size problem, simultaneous learning, SVM, loss function | Structured support vector machine, Least squares, Pattern recognition, Least squares support vector machine, Regression, Multivariate statistics, Support vector machine, Variance function, Artificial intelligence, Relevance vector machine, Mathematics, Machine learning | Journal |
Volume | Issue | ISSN |
---|---|---|
69 | 1-3 | 0925-2312 |
Citations | PageRank | References |
---|---|---|
15 | 0.95 | 9 |
Authors |
---|
4 |
Name | Order | Citations | PageRank |
---|---|---|---|
Marcelino Lázaro | 1 | 78 | 11.34 |
Ignacio Santamaría | 2 | 941 | 81.56 |
Fernando Pérez-Cruz | 3 | 749 | 61.24 |
Antonio Artés-Rodríguez | 4 | 206 | 34.76 |