| Abstract |
|---|
| In the pattern recognition field, Support Vector Machines (SVM) have been an effective tool for classification, successfully employed in many applications. The SVM input data are mapped into a high-dimensional space by a kernel function, where linear separation is more likely. However, SVMs have some computational drawbacks. One of them is the computational burden required to find suitable kernel parameters for each non-linearly separable input space, which directly affects SVM performance. This paper introduces the Polynomial Powers of Sigmoid (PPS) for SVM kernel mapping and shows its advantages over well-known kernel functions on real and synthetic datasets. |
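The kernel mapping described in the abstract can be illustrated with a small sketch. The function below is a hypothetical PPS-style kernel, modeled as a sum of powers of the sigmoid (tanh) kernel; the exact formulation and coefficients used in the paper may differ, and the parameter names (`gamma`, `coef0`, `degree`) are assumptions for illustration. It is plugged into scikit-learn's `SVC`, which accepts a callable kernel.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.datasets import make_moons
from sklearn.model_selection import train_test_split

def pps_kernel(X, Y, gamma=1.0, coef0=0.0, degree=2):
    """Hypothetical PPS-style kernel: polynomial combination of
    powers of the sigmoid (tanh) kernel. Illustrative only; the
    paper's actual formulation may differ."""
    s = np.tanh(gamma * X @ Y.T + coef0)          # base sigmoid kernel
    return sum(s ** k for k in range(1, degree + 1))  # s + s^2 + ... + s^degree

# A non-linearly separable synthetic dataset, as mentioned in the abstract.
X, y = make_moons(n_samples=200, noise=0.2, random_state=42)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=42)

clf = SVC(kernel=pps_kernel)   # SVC accepts a callable Gram-matrix kernel
clf.fit(X_tr, y_tr)
acc = clf.score(X_te, y_te)
```

Note that sigmoid-based kernels are not positive semi-definite in general, so in practice the kernel parameters would need the tuning the abstract refers to.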
Year | DOI | Venue |
---|---|---|
2014 | 10.1109/SIBGRAPI.2014.36 | SIBGRAPI |
| Keywords | Field | DocType |
|---|---|---|
| support vector machines, kernel functions, machine learning | Least squares support vector machine, Radial basis function kernel, Pattern recognition, Kernel principal component analysis, Tree kernel, Polynomial kernel, Artificial intelligence, Kernel method, String kernel, Variable kernel density estimation, Mathematics | Conference |
| Citations | PageRank | References |
|---|---|---|
| 0 | 0.34 | 8 |
| Authors |
|---|
| 5 |
| Name | Order | Citations | PageRank |
|---|---|---|---|
Silas Evandro Nachif Fernandes | 1 | 2 | 1.75 |
Andre Luiz Pilastri | 2 | 0 | 0.68 |
Luís A. M. Pereira | 3 | 129 | 8.87 |
Rafael Goncalves Pires | 4 | 2 | 3.11 |
João Paulo Papa | 5 | 278 | 44.60 |