| Abstract |
|---|
| The least squares support vector machine (LSSVM) is computationally efficient because it converts the quadratic programming problem in SVM training into a set of linear equations. Sparse LSSVMs have been proposed to improve prediction speed and generalization capability. In this paper, two sparse LSSVM algorithms, the SMRLSSVM and the RQRLSSVM, are proposed based on the Localized Generalization Error of the LSSVM. Experimental results show that the RQRLSSVM yields both better generalization capability and greater sparseness than other sparse LSSVM algorithms. © 2016, Springer-Verlag Berlin Heidelberg. |
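The abstract's claim about efficiency rests on the classical (non-sparse) LSSVM dual, which replaces the SVM's quadratic program with one linear system. Below is a minimal sketch of that base training step, not the paper's SMRLSSVM/RQRLSSVM variants; function names and hyperparameters (`C`, the RBF width `gamma`) are illustrative assumptions, not drawn from the paper.

```python
import numpy as np

def rbf_kernel(A, B, gamma=1.0):
    # Gaussian (RBF) kernel matrix between the rows of A and the rows of B
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def lssvm_train(X, y, C=10.0, gamma=1.0):
    # LSSVM dual: solve one (n+1)x(n+1) linear system instead of a QP
    #   [ 0    1^T      ] [b]     [0]
    #   [ 1    K + I/C  ] [alpha] = [y]
    n = len(y)
    K = rbf_kernel(X, X, gamma)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = K + np.eye(n) / C
    rhs = np.concatenate(([0.0], y))
    sol = np.linalg.solve(A, rhs)
    return sol[1:], sol[0]  # (alpha, bias b)

def lssvm_predict(X_train, alpha, b, X_new, gamma=1.0):
    # Decision value f(x) = sum_i alpha_i * K(x, x_i) + b
    return rbf_kernel(X_new, X_train, gamma) @ alpha + b
```

Note that every `alpha_i` is typically nonzero here, which is exactly the lack of sparseness the paper's pruning-based variants address.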
| Year | DOI | Venue |
|---|---|---|
| 2017 | 10.1007/s13042-016-0563-6 | International Journal of Machine Learning and Cybernetics |
| Keywords | Field | DocType |
|---|---|---|
| Least squares support vector machine (LSSVM); Localized generalization error model (L-GEM); Sensitivity measure; Sparsity | Pattern recognition; Least squares support vector machine; Support vector machine; Linear programming; Artificial intelligence; Generalization error; Quadratic programming; Mathematics | Journal |
| Volume | Issue | ISSN |
|---|---|---|
| 8 | 6 | 1868-8071 |

| Citations | PageRank | References |
|---|---|---|
| 4 | 0.38 | 22 |
| Authors |
|---|
| 3 |
| Name | Order | Citations | PageRank |
|---|---|---|---|
| Sun Binbin | 1 | 4 | 0.38 |
| Wing W. Y. Ng | 2 | 528 | 56.12 |
| Patrick P. K. Chan | 3 | 271 | 33.82 |