Abstract |
---|
A simple learning algorithm for maximal margin classifiers (also known as support vector machines with a quadratic cost function) is proposed. We build our iterative algorithm on top of the Schlesinger–Kozinec algorithm (S–K algorithm) from 1981, which finds a maximal margin hyperplane with a given precision for separable data. We suggest a generalization of the S–K algorithm (i) to the non-linear case using kernel functions and (ii) to non-separable data. The memory requirements grow linearly with the number of training data, which allows the proposed algorithm to be used for large training problems. |
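The S–K algorithm mentioned in the abstract finds the nearest points of the two classes' convex hulls; the maximal margin hyperplane is then the perpendicular bisector of the segment joining them. A minimal sketch of the linear, separable case is given below (the variable names, the simplified ε-stopping rule, and the step-size clipping are illustrative assumptions; the paper's contribution, the kernelized and non-separable generalization, is not shown):

```python
import numpy as np

def sk_algorithm(X1, X2, eps=1e-3, max_iter=10000):
    """Illustrative linear Schlesinger-Kozinec sketch: iteratively move
    w1 in conv(X1) and w2 in conv(X2) toward each other until the pair
    is (approximately) the closest one. Assumes separable data."""
    w1, w2 = X1[0].copy(), X2[0].copy()
    for _ in range(max_iter):
        w = w1 - w2
        # signed improvement of each training point along the current direction
        m1 = X1 @ w - w1 @ w          # negative => x in X1 can pull w1 closer
        m2 = w2 @ w - X2 @ w          # negative => x in X2 can pull w2 closer
        i1, i2 = int(np.argmin(m1)), int(np.argmin(m2))
        if min(m1[i1], m2[i2]) > -eps * np.linalg.norm(w):
            break                      # simplified epsilon-optimality test
        if m1[i1] < m2[i2]:
            # Kozinec step: w1 <- (1-lam) w1 + lam x, lam minimizing |w1 - w2|
            x = X1[i1]
            lam = np.clip((w1 - x) @ (w1 - w2) / ((w1 - x) @ (w1 - x)), 0.0, 1.0)
            w1 = (1 - lam) * w1 + lam * x
        else:
            x = X2[i2]
            lam = np.clip((w2 - x) @ (w2 - w1) / ((w2 - x) @ (w2 - x)), 0.0, 1.0)
            w2 = (1 - lam) * w2 + lam * x
    w = w1 - w2
    b = (w1 @ w1 - w2 @ w2) / 2.0      # hyperplane bisects the segment w1-w2
    return w, b                         # classify by sign(w @ x - b)
```

Because the hyperplane is stored via convex combinations of training points, only the data and the combination coefficients must be kept, which is the source of the linear memory requirement noted in the abstract.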
Year | DOI | Venue |
---|---|---|
2003 | 10.1016/S0031-3203(03)00060-8 | Pattern Recognition |
Keywords | Field | DocType
---|---|---|
Pattern recognition,Linear classifier,Supervised learning,Support vector machines,Kernel functions | Margin (machine learning),Ramer–Douglas–Peucker algorithm,Linde–Buzo–Gray algorithm,Pattern recognition,FSA-Red Algorithm,Artificial intelligence,Margin classifier,Population-based incremental learning,Machine learning,Difference-map algorithm,Weighted Majority Algorithm,Mathematics | Journal
Volume | Issue | ISSN
---|---|---|
36 | 9 | 0031-3203
Citations | PageRank | References
---|---|---|
37 | 2.04 | 2
Authors |
---|
2 |
Name | Order | Citations | PageRank |
---|---|---|---|
Vojtěch Franc | 1 | 584 | 55.78 |
Václav Hlaváč | 2 | 216 | 13.42 |