Abstract |
---|
This paper presents a strategy to optimize the learning phase of the Support Vector Machine (SVM) algorithm. SVMs are widely used for tasks such as classification, regression, density estimation, and clustering. However, the algorithm has important drawbacks when learning large-scale problems: training an SVM requires solving a quadratic programming (QP) problem, which is computationally expensive, and during the learning step the best working set must be selected, which is itself a difficult task. In this work, we combine a heuristic approach that selects the best working-set data with a projected conjugate gradient method, a fast and easy-to-implement algorithm that solves the QP problem involved in SVM training. We compare the performance of the optimization strategies on several well-known benchmark databases. |
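The abstract refers to the SVM dual QP and a projected-gradient-type solver. The sketch below is a heavily simplified, hypothetical illustration of that idea, not the paper's method: it uses plain projected gradient ascent (not conjugate gradient), omits working-set selection, and drops the bias term so that only the box constraints 0 ≤ αᵢ ≤ C remain and the equality constraint Σαᵢyᵢ = 0 disappears. The toy data set is invented for illustration.

```python
# Hypothetical sketch: projected gradient ascent on the (bias-free) SVM dual.
# Not the paper's projected conjugate gradient with heuristic working sets.

def linear_kernel(a, b):
    return sum(x * y for x, y in zip(a, b))

def train_svm_dual(X, y, C=1.0, lr=0.01, steps=2000):
    """Maximize sum(alpha) - 0.5 * alpha' Q alpha, Q_ij = y_i y_j K(x_i, x_j),
    subject to the box constraints 0 <= alpha_i <= C (bias term omitted)."""
    n = len(X)
    K = [[linear_kernel(X[i], X[j]) for j in range(n)] for i in range(n)]
    alpha = [0.0] * n
    for _ in range(steps):
        # Gradient of the dual objective: 1 - y_i * sum_j alpha_j y_j K_ij
        grad = [1.0 - y[i] * sum(alpha[j] * y[j] * K[i][j] for j in range(n))
                for i in range(n)]
        # Ascent step followed by projection onto the box [0, C]
        alpha = [min(C, max(0.0, alpha[i] + lr * grad[i])) for i in range(n)]
    return alpha

def predict(X, y, alpha, x):
    s = sum(alpha[i] * y[i] * linear_kernel(X[i], x) for i in range(len(X)))
    return 1 if s >= 0 else -1

# Toy linearly separable data (hypothetical)
X = [(2.0, 2.0), (2.5, 1.5), (-2.0, -2.0), (-1.5, -2.5)]
y = [1, 1, -1, -1]
alpha = train_svm_dual(X, y)
```

The projection step is what makes this "projected" gradient: after each ascent step, each αᵢ is clipped back into [0, C]. The paper's contribution replaces the plain gradient direction with conjugate directions and restricts each update to a heuristically chosen working set, which is what makes large-scale training tractable.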
Year | DOI | Venue
---|---|---
2009 | 10.1007/978-3-642-05258-3_21 | MICAI

Keywords | Field | DocType
---|---|---
support vector | Structured support vector machine, Ranking SVM, Computer science, Artificial intelligence, Quadratic programming, Cluster analysis, Mathematical optimization, Heuristic, Least squares support vector machine, Pattern recognition, Support vector machine, Sequential minimal optimization, Machine learning | Conference

Volume | ISSN | Citations
---|---|---
5845 | 0302-9743 | 0

PageRank | References | Authors
---|---|---
0.34 | 5 | 4
Name | Order | Citations | PageRank |
---|---|---|---|
Ariel García-Gamboa | 1 | 3 | 1.22 |
Neil Hernandez-Gress | 2 | 28 | 8.51 |
Miguel González Mendoza | 3 | 4 | 2.46 |
Jaime Mora-Vargas | 4 | 0 | 0.34 |