Abstract |
---|
The support vector machine (SVM) is a new learning methodology based on Vapnik-Chervonenkis (VC) theory (Vapnik, 1982, 1995). SVM has recently attracted growing research interest due to its ability to learn classification and regression tasks with high-dimensional data. The SVM formulation uses a kernel representation, but existing algorithms leave the choice of kernel type and kernel parameters to the user. This paper describes an important extension of the SVM method: the multiresolution SVM (M-SVM), in which several kernels of different scales are used simultaneously to approximate the target function. The proposed M-SVM approach enables 'automatic' selection of the 'optimal' kernel width, which usually results in better prediction accuracy of SVM models. |
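The abstract's core idea, combining kernels of several scales into one kernel, can be sketched as follows. This is a minimal illustration, not the paper's actual M-SVM formulation: it sums RBF kernels of two hypothetical widths (a sum of positive-definite kernels is itself positive definite) and fits the resulting multi-scale kernel with kernel ridge regression as a stand-in for the SVM training procedure, which the paper solves as a quadratic program.

```python
import numpy as np

def rbf_kernel(X, Y, width):
    # Gaussian (RBF) kernel with bandwidth `width` (sigma)
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * width ** 2))

def multiresolution_kernel(X, Y, widths):
    # Multi-scale kernel: sum of RBF kernels at several widths.
    # A sum of Mercer kernels is again a valid Mercer kernel.
    return sum(rbf_kernel(X, Y, w) for w in widths)

# Toy 1-D target with both coarse and fine structure
X = np.linspace(-1, 1, 40)[:, None]
y = np.sin(3 * X[:, 0]) + 0.3 * np.sin(20 * X[:, 0])

# Fit with kernel ridge regression (illustrative widths 0.05 and 0.5;
# the narrow kernel captures fine detail, the wide one the trend)
K = multiresolution_kernel(X, X, widths=[0.05, 0.5])
alpha = np.linalg.solve(K + 1e-3 * np.eye(len(X)), y)
y_hat = K @ alpha
print("max training error:", np.abs(y_hat - y).max())
```

In an SVM setting the same `K` could be passed to a solver that accepts precomputed Gram matrices; the relative weight given to each scale is then determined by the learned coefficients rather than chosen by hand, which is the sense in which the kernel width selection becomes 'automatic'.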
Year | DOI | Venue |
---|---|---|
1999 | 10.1109/IJCNN.1999.831103 | IJCNN |
Keywords | Field | DocType |
learning (artificial intelligence),neural nets,optimisation,pattern classification,signal processing,statistical analysis,m-svm,svm,vc theory,vapnik-chervonenkis theory,classification tasks,high-dimensional data,kernel parameters,kernel representation,learning methodology,multiresolution support vector machine,optimal kernel width,regression tasks,support vector machines,frequency,learning artificial intelligence,polynomials,machine learning,support vector machine,multiresolution analysis,kernel,high dimensional data,signal analysis | Structured support vector machine,Pattern recognition,Radial basis function kernel,Least squares support vector machine,Ranking SVM,Computer science,Support vector machine,Tree kernel,Polynomial kernel,Artificial intelligence,Kernel method,Machine learning | Conference |
Volume | ISSN | ISBN |
2 | 1098-7576 | 0-7803-5529-6 |
Citations | PageRank | References |
3 | 0.56 | 0 |
Authors |
---|
2 |
Name | Order | Citations | PageRank |
---|---|---|---|
X Shao | 1 | 198 | 23.20 |
Vladimir Cherkassky | 2 | 1064 | 126.66 |