| Abstract |
|---|
| In this paper, a novel sparse least squares support vector regression algorithm, referred to as LSSVR-SBF, is introduced. It uses a new low-rank kernel based on simplex basis functions, which carry a set of nonlinear parameters. It is shown that the proposed model can be represented as a sparse linear regression model based on simplex basis functions. A fast algorithm is proposed that computes the least squares support vector regression solution with O(N) complexity by avoiding direct kernel matrix inversion. An iterative estimation algorithm is proposed to optimize the nonlinear parameters associated with the simplex basis functions, minimizing the model mean square error via gradient descent. The proposed fast least squares solution and the gradient descent algorithm are applied alternately. Finally, it is shown that the model has a dual representation as a piecewise linear model with respect to the system input. Numerical experiments demonstrate the effectiveness of the proposed approaches. |
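The abstract's alternating scheme can be sketched as follows: a regularized least squares step for the linear weights that works with the N×M design matrix (cost linear in N, no N×N kernel inversion), alternating with gradient descent on the nonlinear basis parameters. This is a minimal illustration, not the paper's method: the piecewise-linear basis form `max(0, 1 - beta*||x - c||_1)`, the finite-difference gradient, and all names (`design`, `fit_weights`, step sizes) are assumptions for the sketch; the paper derives analytic gradients.

```python
import numpy as np

rng = np.random.default_rng(0)
N, M = 200, 8                              # samples, basis functions
X = rng.uniform(-1, 1, (N, 1))
y = np.sinc(3 * X[:, 0]) + 0.05 * rng.standard_normal(N)

centres = rng.uniform(-1, 1, (M, 1))       # nonlinear parameters (assumed form)
betas = np.full(M, 2.0)
lam = 1e-3                                 # ridge regularizer

def design(X, centres, betas):
    """Piecewise-linear basis matrix Phi (N x M), an assumed SBF-like form."""
    d = np.abs(X[:, None, :] - centres[None, :, :]).sum(-1)   # L1 distances
    return np.maximum(0.0, 1.0 - betas[None, :] * d)

def fit_weights(Phi, y, lam):
    """Regularized LS on the M x M normal equations: cost linear in N."""
    A = Phi.T @ Phi + lam * np.eye(Phi.shape[1])
    return np.linalg.solve(A, Phi.T @ y)

def mse(centres, betas, w):
    Phi = design(X, centres, betas)
    return np.mean((y - Phi @ w) ** 2)

eta, eps = 0.05, 1e-4
for _ in range(30):
    # Step 1: fast least squares solution for the linear weights.
    w = fit_weights(design(X, centres, betas), y, lam)
    # Step 2: gradient descent on the nonlinear parameters
    # (finite differences here; the paper uses analytic gradients).
    g = np.zeros_like(centres)
    for j in range(M):
        cp, cm = centres.copy(), centres.copy()
        cp[j, 0] += eps
        cm[j, 0] -= eps
        g[j, 0] = (mse(cp, betas, w) - mse(cm, betas, w)) / (2 * eps)
    centres -= eta * g

final_mse = mse(centres, betas, w)
```

Because the weights are refit by regularized least squares each pass, the fitted model can do no worse than the zero predictor on the training set, and the per-iteration cost stays O(N M^2) rather than the O(N^3) of inverting a full kernel matrix.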
| Year | DOI | Venue |
|---|---|---|
| 2019 | 10.1016/j.neucom.2018.11.025 | Neurocomputing |
| Keywords | Field | DocType |
|---|---|---|
| Least squares support vector regression, Low rank kernels, Simplex basis function | Kernel (linear algebra), Least squares, Mean square, Gradient descent, Pattern recognition, Support vector machine, Algorithm, Simplex, Basis function, Artificial intelligence, Mathematics, Linear regression | Journal |
| Volume | ISSN | Citations |
|---|---|---|
| 330 | 0925-2312 | 0 |

| PageRank | References | Authors |
|---|---|---|
| 0.34 | 13 | 3 |
| Name | Order | Citations | PageRank |
|---|---|---|---|
| X Hong | 1 | 216 | 19.36 |
| Richard Mitchell | 2 | 86 | 14.57 |
| Giuseppe Di Fatta | 3 | 529 | 39.23 |