Abstract |
---|
The Support Vector Machine (SVM) is one of the most important classes of machine learning models and algorithms, and has been successfully applied in various fields. Nonlinear optimization plays a crucial role in SVM methodology, both in defining the machine learning models and in designing convergent and efficient algorithms for large-scale training problems. In this paper we present the convex programming problems underlying SVM, focusing on supervised binary classification. We analyze the most important and widely used optimization methods for SVM training problems, and we discuss how the properties of these problems can be exploited in designing effective algorithms. |
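To make the training problem the abstract refers to concrete, here is a minimal sketch (not taken from the paper itself) of fitting a binary SVM classifier with scikit-learn; the `SVC.fit` call solves the underlying convex quadratic programming problem. The synthetic dataset and the kernel/regularization choices (`rbf`, `C=1.0`) are illustrative assumptions, not the paper's setup.

```python
# Hedged sketch: binary SVM classification via a convex QP solver.
# Dataset and hyperparameters are illustrative assumptions.
from sklearn.datasets import make_classification
from sklearn.svm import SVC

# Small synthetic binary classification problem.
X, y = make_classification(n_samples=100, n_features=4, random_state=0)

# SVC solves the (dual) convex quadratic program for the soft-margin SVM;
# C trades off margin width against training errors.
clf = SVC(kernel="rbf", C=1.0)
clf.fit(X, y)

# Training accuracy and the support vectors identified by the QP solution.
acc = clf.score(X, y)
n_support = clf.support_vectors_.shape[0]
```

The fitted model keeps only the support vectors, i.e. the training points with nonzero dual multipliers in the QP solution, which is what makes kernel SVMs tractable at prediction time.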
Year | DOI | Venue |
---|---|---|
2018 | 10.1007/s10288-018-0378-2 | 4OR |
Keywords | Field | DocType
---|---|---|
Statistical learning theory, Support vector machine, Convex quadratic programming, Wolfe’s dual theory, Kernel functions, Nonlinear optimization methods, 65K05 Mathematical programming methods, 90C25 Convex programming, 90C30 Nonlinear programming | Statistical learning theory, Management information systems, Mathematical optimization, Binary classification, Support vector machine, Nonlinear programming, Convex quadratic programming, Artificial intelligence, Convex optimization, Mathematics, Kernel (statistics) | Journal
Volume | Issue | ISSN
---|---|---|
16 | 2 | 1619-4500
Citations | PageRank | References
---|---|---|
2 | 0.35 | 50
Authors |
---|
2 |
Name | Order | Citations | PageRank |
---|---|---|---|
Veronica Piccialli | 1 | 259 | 20.63 |
M. Sciandrone | 2 | 335 | 29.01 |