Abstract | ||
---|---|---|
The support vector machine (SVM) was initially designed for binary classification. To extend the SVM to the multi-class scenario, a number of classification models have been proposed, such as the one by Crammer and Singer (2001). However, the number of variables in Crammer and Singer's dual problem is the product of the number of samples (l) and the number of classes (k), which leads to a large computational cost. This paper presents a simplified multi-class SVM (SimMSVM) that reduces the size of the resulting dual problem from l×k to l by introducing a relaxed classification error bound. Experimental results demonstrate that the proposed SimMSVM approach can greatly speed up the training process while maintaining competitive classification accuracy. |
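The abstract's central claim is the reduction of the dual problem from l×k variables (one per sample-class pair in Crammer and Singer's formulation) to l variables (one per sample). A minimal sketch of that arithmetic, using hypothetical values of l and k that are not taken from the paper:

```python
# Size of the dual optimization problem in each formulation.
# l (number of samples) and k (number of classes) are illustrative
# values only, not figures from the paper's experiments.
l, k = 10000, 10

crammer_singer_vars = l * k  # one dual variable per (sample, class) pair
simmsvm_vars = l             # one dual variable per sample

print(crammer_singer_vars)  # → 100000
print(simmsvm_vars)         # → 10000
```

For a fixed solver, shrinking the variable count by a factor of k is what underlies the reported training speed-up, since the cost of dual optimization grows with the number of variables.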
Year | DOI | Venue |
---|---|---|
2012 | 10.1016/j.patrec.2011.09.035 | Pattern Recognition Letters |
Keywords | Field | DocType
---|---|---|
large computational complexity, multi-class SVM, dual problem, proposed SimMSVM approach, competitive classification accuracy, classification model, reduced dual optimization, classification error, multi-class scenario, binary classification, multi-class support vector machine, support vector machine, multi-class classification | Structured support vector machine, Pattern recognition, Binary classification, Support vector machine, Algorithm, Duality (optimization), Artificial intelligence, Relevance vector machine, Mathematics, Computational complexity theory, Multiclass classification | Journal
Volume | Issue | ISSN
---|---|---|
33 | 1 | 0167-8655
Citations | PageRank | References
---|---|---|
13 | 0.52 | 26
Authors | ||
---|---|---|
5 |
Name | Order | Citations | PageRank |
---|---|---|---|
Xisheng He | 1 | 20 | 1.30 |
Zhe Wang | 2 | 268 | 18.89 |
Cheng Jin | 3 | 78 | 14.92 |
Yingbin Zheng | 4 | 191 | 16.70 |
Xiangyang Xue | 5 | 2466 | 154.25 |