Abstract |
---|
Support Vector Machines (SVMs) with various kernels have become very successful in pattern classification and regression. However, single kernels do not lead to optimal data models. Replacing the input space by a kernel-based feature space, in which the linear discrimination problem with margin maximization is solved, is a general method that allows for mixing various kernels and adding new types of features. We show here how to generate locally optimized kernels that facilitate multi-resolution and can handle complex data distributions using simpler models than the standard data formulation may provide. |
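The kernel-based feature space that the abstract describes can be illustrated with a minimal sketch (a hypothetical illustration, not the paper's actual method — the function names, kernel choices, and mixing weights below are all assumptions): each sample is mapped to a vector of kernel evaluations against a set of anchor points, and two kernels are mixed by a weighted sum, after which a linear discriminant with margin maximization could be trained on these features.

```python
import numpy as np

def rbf_kernel(X, Z, gamma=1.0):
    # Gaussian (RBF) kernel matrix: K[i, j] = exp(-gamma * ||x_i - z_j||^2)
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def poly_kernel(X, Z, degree=2, c=1.0):
    # Polynomial kernel matrix: K[i, j] = (x_i . z_j + c)^degree
    return (X @ Z.T + c) ** degree

def mixed_kernel_features(X, anchors, weights=(0.5, 0.5)):
    """Map each sample to a kernel-based feature vector: a weighted mix
    of two kernels evaluated against a set of anchor points."""
    a, b = weights
    return a * rbf_kernel(X, anchors) + b * poly_kernel(X, anchors)

# Toy data: each point becomes a 3-dimensional kernel-feature vector,
# one mixed-kernel value per anchor point.
X = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]])
F = mixed_kernel_features(X, X)
print(F.shape)  # (3, 3)
```

A linear SVM trained on the columns of `F` then realizes a nonlinear decision boundary in the original input space; adding columns from further kernels (or other feature types) only widens this matrix.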
Year | DOI | Venue |
---|---|---|
2012 | 10.1007/978-3-642-29347-4_48 | ICAISC (1) |
Keywords | Field | DocType
---|---|---|
optimized kernel,support vector machines,input space,complex data distribution,margin maximization,data model,kernel-based feature space,various kernel,standard data formulation,general method,linear discrimination problem,database management,artificial intelligence | Kernel (linear algebra),Data mining,Data modeling,Feature vector,Pattern recognition,Regression,Computer science,Support vector machine,Complex data type,Artificial intelligence,Machine learning,Margin maximization | Conference
Volume | ISSN | Citations
---|---|---|
7267 | 0302-9743 | 0
PageRank | References | Authors
---|---|---|
0.34 | 9 | 2
Name | Order | Citations | PageRank |
---|---|---|---|
Tomasz Maszczyk | 1 | 42 | 5.29 |
Włodzisław Duch | 2 | 291 | 28.95 |