Title
Twin least squares support vector regression
Abstract
In this paper, combining the idea of twin hyperplanes with the computational speed of least squares support vector regression (LSSVR) yields a new regressor, termed twin least squares support vector regression (TLSSVR). As a result, TLSSVR outperforms standard LSSVR in generalization performance and, compared with other twin-hyperplane algorithms, is computationally faster; this advantage becomes pronounced on large scale problems. To accelerate the testing speed of TLSSVR, a simple sparsification mechanism is applied, yielding STLSSVR. In addition to introducing these algorithms, extensive experiments are carried out, including a toy problem, several small and large scale data sets, and a gas furnace example. These applications demonstrate the effectiveness and efficiency of the proposed algorithms.
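For illustration only, below is a minimal Python sketch in the spirit of the twin least-squares idea: two epsilon-shifted least-squares bound regressors are obtained in closed form from the standard LSSVR linear system, and the final prediction is their mean. The kernel choice, the parameter names (C1, C2, eps1, eps2, sigma), and the epsilon-shift construction are illustrative assumptions; the paper's actual pair of optimization problems and the STLSSVR sparsification step are not reproduced here.

# Minimal sketch of a twin least-squares regressor in the spirit of TLSSVR.
# The objective, parameters (C1, C2, eps1, eps2, sigma) and the averaging of
# two shifted LSSVR solutions are illustrative assumptions, not the paper's
# exact derivation.
import numpy as np

def rbf_kernel(X, Z, sigma=1.0):
    # Gaussian (RBF) kernel matrix between rows of X and rows of Z.
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def lssvr_solve(K, y, C):
    # Closed-form LSSVR: solve the (n+1)x(n+1) KKT linear system
    # [[0, 1^T], [1, K + I/C]] [b; alpha] = [0; y].
    n = len(y)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = K + np.eye(n) / C
    rhs = np.concatenate(([0.0], y))
    sol = np.linalg.solve(A, rhs)
    return sol[0], sol[1:]  # bias b, dual coefficients alpha

def tlssvr_fit(X, y, C1=10.0, C2=10.0, eps1=0.1, eps2=0.1, sigma=1.0):
    # Fit two least-squares bound regressors (down- and up-bound) and
    # predict with their average, mimicking the twin-hyperplane idea.
    K = rbf_kernel(X, X, sigma)
    b1, a1 = lssvr_solve(K, y - eps1, C1)  # down-bound function f1
    b2, a2 = lssvr_solve(K, y + eps2, C2)  # up-bound function f2
    def predict(Xt):
        Kt = rbf_kernel(Xt, X, sigma)
        f1 = Kt @ a1 + b1
        f2 = Kt @ a2 + b2
        return 0.5 * (f1 + f2)  # final regressor: mean of the two bounds
    return predict

# Toy usage on noisy sinc data, loosely analogous to the paper's toy problem.
rng = np.random.default_rng(0)
X = np.linspace(-3, 3, 120).reshape(-1, 1)
y = np.sinc(X).ravel() + 0.05 * rng.standard_normal(len(X))
predict = tlssvr_fit(X, y, sigma=0.7)
print(np.abs(predict(X) - np.sinc(X).ravel()).mean())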
Year
2013
DOI
10.1016/j.neucom.2013.03.005
Venue
Neurocomputing
Keywords
gas furnace example, normal LSSVR, fast speed, computational speed, generalization performance, testing speed, squares support vector regression, large scale data set, twin hyperplanes, large scale problem, support vector machine, kernel method, least squares, support vector regression
Field
Least squares, Mathematical optimization, Least squares support vector machine, Support vector machine, Partial least squares regression, Generalized least squares, Artificial intelligence, Non-linear least squares, Total least squares, Kernel method, Mathematics, Machine learning
DocType
Journal
Volume
118
ISSN
0925-2312
Citations
20
PageRank
0.63
References
38
Authors
3
Name            Order   Citations   PageRank
Yongping Zhao   1       211         12.75
Jing Zhao       2       20          0.63
Min Zhao        3       21          5.73