Abstract
---|
In kernel space, support vector selection is an important issue for Support Vector Machines (SVMs). At present, most sample selection methods share a common disadvantage: the candidate set for support vectors is the whole sample space, so they may select interior samples or "outliers" that have little or even a harmful effect on classification quality. To tackle this, two improved methods based on an effective candidate set are proposed in this paper. With these two methods, the effective candidate set is first identified by "removing the center" and eliminating "outliers", and support vectors are then selected from this effective candidate set. Experimental results show that the methods preserve the effective candidate samples and also improve the performance of the SVM classifier in kernel space.
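The abstract's pre-filtering step can be sketched as follows. This is a minimal illustration, not the paper's actual algorithm: it assumes an RBF kernel, uses the kernel-trick distance to each class mean to rank samples, and drops the fraction nearest the center ("removing center") and the fraction farthest from it (eliminating "outliers"); the fraction parameters `inner_frac` and `outlier_frac` are hypothetical choices for illustration.

```python
import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    # Gram matrix of the RBF kernel k(x, y) = exp(-gamma * ||x - y||^2).
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def effective_candidates(X, y, inner_frac=0.3, outlier_frac=0.1, gamma=1.0):
    """Return indices of an 'effective candidate set': per class, drop the
    inner_frac of samples closest to the class center in kernel space
    (interior samples) and the outlier_frac farthest from it (outliers)."""
    keep = []
    for c in np.unique(y):
        idx = np.where(y == c)[0]
        K = rbf_kernel(X[idx], X[idx], gamma)
        n = len(idx)
        # Squared kernel-space distance from each sample to the class mean:
        # ||phi(x) - m||^2 = k(x,x) - (2/n) sum_i k(x,x_i) + (1/n^2) sum_ij k(x_i,x_j)
        d2 = np.diag(K) - 2.0 * K.mean(axis=1) + K.mean()
        order = np.argsort(d2)          # nearest-to-center first
        lo = int(n * inner_frac)        # interior samples to drop
        hi = n - int(n * outlier_frac)  # outliers to drop
        keep.extend(idx[order[lo:hi]])
    return np.sort(np.array(keep))
```

An SVM would then be trained only on `X[effective_candidates(X, y)]`, shrinking the training set while retaining the boundary-region samples most likely to become support vectors.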
Year | DOI | Venue |
---|---|---|
2008 | 10.1109/CSSE.2008.942 | CSSE (1) |
Keywords | Field | DocType |
support vector machines,support vectors selection,support vectors,effective method,sample selection method,svms classifier,effective candidate set,improved method,interior sample,kernel space,effective candidate sample,support vector,support vector machine,clustering algorithms,face recognition,accuracy,kernel | Graph kernel,Radial basis function kernel,Least squares support vector machine,Pattern recognition,Kernel embedding of distributions,Computer science,Support vector machine,Polynomial kernel,Artificial intelligence,String kernel,Kernel method,Machine learning | Conference |
Citations | PageRank | References |
0 | 0.34 | 4 |
Authors
---|
3
Name | Order | Citations | PageRank |
---|---|---|---|
Wang Zhan-qing | 1 | 0 | 0.34 |
Wang Chuan-ting | 2 | 0 | 0.34 |
Hou Feng | 3 | 0 | 0.34 |