Abstract |
---|
A weighted SVM regression model and its online training algorithm are developed based on the Karush–Kuhn–Tucker (KKT) conditions and the Lagrange multiplier method. The standard SVM regression model treats every sample equally, with the same error requirement, which is unsuitable when different samples contribute differently to the construction of the regression model. In the new weighted model, every training sample is assigned a weight coefficient that reflects these differences among samples. Moreover, the standard online training algorithm cannot remove redundant samples effectively, so a new method for removing redundant samples is presented. Simulation on a benchmark problem shows that the new algorithm quickly and accurately approximates nonlinear, time-varying functions while requiring less computer memory. |
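The paper's exact weighted ε-insensitive SVM solver is not reproduced here. As a minimal illustrative sketch of the core idea (a per-sample weight coefficient that makes some samples count more than others in a kernel regression fit), the following uses a weighted kernel ridge analogue with squared loss instead of the ε-insensitive loss; the function names `rbf_kernel`, `fit_weighted`, and `predict`, and all parameter values, are assumptions for illustration, not the authors' method:

```python
import numpy as np

def rbf_kernel(A, B, gamma):
    # Gaussian (RBF) kernel matrix between row-sample matrices A and B.
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def fit_weighted(X, y, w, lam, gamma):
    # Weighted kernel regression: minimize sum_i w_i (y_i - f(x_i))^2 + lam*||f||^2.
    # Setting the gradient to zero gives (W K + lam I) alpha = W y.
    K = rbf_kernel(X, X, gamma)
    W = np.diag(w)
    return np.linalg.solve(W @ K + lam * np.eye(len(y)), W @ y)

def predict(X_train, alpha, X_new, gamma):
    return rbf_kernel(X_new, X_train, gamma) @ alpha

# Toy time-varying setting: newer samples get larger weights, so the fit
# tracks recent behaviour more closely than a uniformly weighted fit would.
rng = np.random.default_rng(0)
X = np.linspace(0.0, 1.0, 40)[:, None]
y = np.sin(2 * np.pi * X[:, 0]) + 0.05 * rng.standard_normal(40)
w_recent = np.linspace(0.1, 1.0, 40)          # emphasize newer samples
alpha = fit_weighted(X, y, w_recent, lam=1e-2, gamma=50.0)
y_hat = predict(X, alpha, X, gamma=50.0)
```

Down-weighting rather than deleting old samples is a softer alternative to the redundancy-removal step the abstract describes; the paper instead removes redundant samples outright to save memory during online training.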
Year | DOI | Venue |
---|---|---|
2005 | 10.1007/11539087_95 | ICNC (1) |
Keywords | Field | DocType
---|---|---|
weighted svm regression model, new weighted model, new method, new algorithm, standard online training algorithm, on-line training algorithm, different sample, regression model, weighted on-line svm regression, redundant sample, standard svm regression model | Mathematical optimization, Nonlinear system, Regression analysis, Computer science, Lagrange multiplier, Svm regression, Algorithm, Weight coefficient, Redundancy (engineering), Karush–Kuhn–Tucker conditions, Computer memory | Conference
Volume | ISSN | ISBN
---|---|---|
3610 | 0302-9743 | 3-540-28323-4
Citations | PageRank | References
---|---|---|
1 | 0.34 | 5
Authors |
---|
3 |
Name | Order | Citations | PageRank |
---|---|---|---|
Hui Wang | 1 | 175 | 35.62 |
Daoying Pi | 2 | 50 | 9.21 |
Youxian Sun | 3 | 2707 | 196.15 |