Abstract |
---|
In this paper we present a primal-dual decomposition algorithm for support vector machine training. As with existing methods that use very small working sets (such as Sequential Minimal Optimization (SMO), Successive Over-Relaxation (SOR) or the Kernel Adatron (KA)), our method scales well, is straightforward to implement, and does not require an external QP solver. Unlike SMO, SOR and KA, the method is applicable to a large number of SVM formulations regardless of the number of equality constraints involved. The effectiveness of our algorithm is demonstrated on a more difficult SVM variant in this respect, namely semi-parametric support vector regression. |
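The paper itself is not reproduced in this record, but as background for the abstract's reference to methods with "very small working sets", the SMO-style update can be sketched as follows. This is a minimal simplified-SMO sketch for a linear-kernel SVM classifier; the function name, toy data, and hyperparameters are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def smo_train(X, y, C=1.0, tol=1e-4, max_passes=10, max_iter=1000, seed=0):
    """Simplified SMO for a linear-kernel SVM: repeatedly optimize the dual
    over a working set of two multipliers, which keeps the single equality
    constraint sum_i alpha_i * y_i = 0 satisfied in closed form."""
    rng = np.random.default_rng(seed)
    n = len(y)
    alpha = np.zeros(n)
    b = 0.0
    K = X @ X.T  # linear kernel Gram matrix
    passes, it = 0, 0
    while passes < max_passes and it < max_iter:
        it += 1
        changed = 0
        for i in range(n):
            E_i = (alpha * y) @ K[:, i] + b - y[i]  # prediction error on x_i
            if (y[i] * E_i < -tol and alpha[i] < C) or (y[i] * E_i > tol and alpha[i] > 0):
                j = int(rng.integers(n - 1))
                j += j >= i  # pick a random second index j != i
                E_j = (alpha * y) @ K[:, j] + b - y[j]
                a_i, a_j = alpha[i], alpha[j]
                # box constraints on alpha[j] implied by the equality constraint
                if y[i] != y[j]:
                    L, H = max(0.0, a_j - a_i), min(C, C + a_j - a_i)
                else:
                    L, H = max(0.0, a_i + a_j - C), min(C, a_i + a_j)
                eta = 2 * K[i, j] - K[i, i] - K[j, j]  # curvature along the step
                if L == H or eta >= 0:
                    continue
                alpha[j] = np.clip(a_j - y[j] * (E_i - E_j) / eta, L, H)
                if abs(alpha[j] - a_j) < 1e-7:
                    continue
                alpha[i] = a_i + y[i] * y[j] * (a_j - alpha[j])
                # update the bias so KKT conditions hold on the working set
                b1 = b - E_i - y[i] * (alpha[i] - a_i) * K[i, i] - y[j] * (alpha[j] - a_j) * K[i, j]
                b2 = b - E_j - y[i] * (alpha[i] - a_i) * K[i, j] - y[j] * (alpha[j] - a_j) * K[j, j]
                b = b1 if 0 < alpha[i] < C else b2 if 0 < alpha[j] < C else 0.5 * (b1 + b2)
                changed += 1
        passes = 0 if changed else passes + 1
    w = (alpha * y) @ X  # recover the primal weight vector (linear kernel only)
    return w, b

# Toy linearly separable problem (illustrative only).
X = np.array([[2.0, 2.0], [3.0, 3.0], [-2.0, -2.0], [-3.0, -1.0]])
y = np.array([1.0, 1.0, -1.0, -1.0])
w, b = smo_train(X, y)
```

Each two-variable step can satisfy the single equality constraint Σᵢαᵢyᵢ = 0 in closed form; formulations with several equality constraints, such as the semi-parametric support vector regression named in the abstract, break this pairing trick, which is the gap the paper's primal-dual decomposition addresses.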
Year | DOI | Venue |
---|---|---|
2005 | 10.1007/11564096_21 | ECML |
Keywords | Field | DocType
---|---|---
primal-dual decomposition algorithm,support vector machine training,support vector machine,support vector regression,semi-parametric support vector regression,SVM formulation,sequential minimal optimization,successive over-relaxation,kernel adatron,multiple equality constraints | Kernel (linear algebra),Mathematical optimization,Computer science,Support vector machine,Algorithm,Solver,Kernel method,Sequential minimal optimization | Conference
Volume | ISSN | ISBN
---|---|---
3720 | 0302-9743 | 3-540-29243-8
Citations | PageRank | References
---|---|---
9 | 0.76 | 7
Authors |
---|
2 |
Name | Order | Citations | PageRank |
---|---|---|---|
Wolf Kienzle | 1 | 391 | 20.73 |
Bernhard Schölkopf | 2 | 23120 | 3091.82 |