Abstract |
---|
Interval-valued data can arise from imprecision in the input information, incompleteness in patterns, discretization procedures, insertion of prior knowledge, or learning speed-up. All existing support vector machine (SVM) approaches to interval data use local kernels based on some distance between intervals, either by combining an interval distance with a kernel or by explicitly defining an interval kernel. This article introduces a new procedure for the linearly separable case, derived from convex optimization theory, which inserts information into the standard SVM directly in the form of intervals, without relying on any particular distance. |
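The article's own convex-optimization formulation is not reproduced in this record. As a rough illustration of the underlying idea of feeding intervals directly into a linear separator (rather than reducing them to a distance-based kernel), the sketch below trains on axis-aligned boxes with perceptron-style updates applied to each box's worst-case vertex, so that the learned hyperplane separates entire intervals rather than single points. This is a hypothetical stand-in, not the authors' method; `worst_vertex` and `interval_perceptron` are names invented for this example.

```python
import numpy as np

def worst_vertex(lo, hi, w, y):
    """Vertex of the box [lo, hi] that minimizes y * (w . x):
    per coordinate, take lo_j when y * w_j > 0, else hi_j."""
    return np.where(y * w > 0, lo, hi)

def interval_perceptron(boxes, ys, epochs=100, lr=0.1):
    """Perceptron-style updates on the worst-case vertex of each box,
    so a correct classification covers the whole interval."""
    w = np.zeros_like(boxes[0][0])
    b = 0.0
    for _ in range(epochs):
        mistakes = 0
        for (lo, hi), y in zip(boxes, ys):
            x = worst_vertex(lo, hi, w, y)
            if y * (w @ x + b) <= 0:   # worst vertex misclassified
                w = w + lr * y * x
                b = b + lr * y
                mistakes += 1
        if mistakes == 0:              # every box fully on its own side
            break
    return w, b

# Toy 2-D boxes: one class near (2.5, 2.5), the other near (-2.5, -2.5).
boxes = [(np.array([2.0, 2.0]), np.array([3.0, 3.0])),
         (np.array([-3.0, -3.0]), np.array([-2.0, -2.0]))]
ys = [1, -1]
w, b = interval_perceptron(boxes, ys)
```

Checking the worst-case vertex of each box suffices: if that vertex lies on the correct side of the hyperplane, every other point of the box does too.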
Year | DOI | Venue |
---|---|---|
2008 | 10.1016/j.neucom.2007.12.025 | Neurocomputing |
Keywords | Field | DocType
---|---|---
support vector machine, discretization procedure, interval kernel, local kernel, certain distance, interval data, convex optimization theory, interval discriminant analysis, standard svm, input information, particular distance, interval distance, interval analysis, convex optimization, discriminant analysis, kernel machine, classification | Least squares support vector machine, Radial basis function kernel, Pattern recognition, Kernel embedding of distributions, Support vector machine, Kernel Fisher discriminant analysis, Polynomial kernel, Artificial intelligence, String kernel, Kernel method, Machine learning, Mathematics | Journal
Volume | Issue | ISSN
---|---|---
71 | 7-9 | 0925-2312
Citations | PageRank | References
---|---|---
12 | 0.60 | 12
Authors |
---|
4 |
Name | Order | Citations | PageRank |
---|---|---|---|
C. Angulo | 1 | 93 | 5.82 |
D. Anguita | 2 | 25 | 1.77 |
L. Gonzalez-Abril | 3 | 153 | 8.48 |
J. A. Ortega | 4 | 14 | 1.04 |