Abstract |
---|
By appropriate editing of the reference set and judicious selection of features, we can obtain an optimal nearest neighbor (NN) classifier that maximizes classification accuracy while saving computational time and memory. In this paper, we propose a new method for simultaneous reference set editing and feature selection for a nearest neighbor classifier. The proposed method is based on the genetic algorithm and employs different genetic encoding strategies according to the size of the problem, so it can be applied to classification problems of various scales. Unlike conventional methods, the resulting classifier uses only a subset of the available references and features, yet demonstrates better classification performance. To demonstrate the performance of the proposed method, we perform experiments on various databases. |
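The abstract describes a genetic algorithm that searches jointly over which reference samples and which features a 1-NN classifier keeps. A minimal sketch of that idea follows, assuming a binary chromosome whose first bits mask the references and whose remaining bits mask the features, with leave-one-out 1-NN accuracy as the fitness; the function names, parameters, and GA operators (one-point crossover, bit-flip mutation, elitist selection) are illustrative assumptions, not the paper's exact encoding strategies.

```python
import random

def nn_accuracy(X, y, ref_mask, feat_mask):
    """Leave-one-out 1-NN accuracy using only the masked references/features."""
    feats = [j for j, b in enumerate(feat_mask) if b]
    refs = [i for i, b in enumerate(ref_mask) if b]
    if not feats or not refs:
        return 0.0
    correct = 0
    for i, x in enumerate(X):
        best, best_d = None, float("inf")
        for r in refs:
            if r == i:
                continue  # a sample may not be its own neighbor
            d = sum((x[j] - X[r][j]) ** 2 for j in feats)
            if d < best_d:
                best_d, best = d, r
        if best is not None and y[best] == y[i]:
            correct += 1
    return correct / len(X)

def ga_select(X, y, pop_size=20, gens=30, p_mut=0.05, seed=0):
    """Evolve a binary chromosome [reference bits | feature bits]."""
    rng = random.Random(seed)
    n, m = len(X), len(X[0])
    length = n + m

    def fitness(c):
        return nn_accuracy(X, y, c[:n], c[n:])

    pop = [[rng.randint(0, 1) for _ in range(length)] for _ in range(pop_size)]
    for _ in range(gens):
        # elitist selection: keep the better half, breed the rest
        elite = sorted(pop, key=fitness, reverse=True)[: pop_size // 2]
        children = []
        while len(elite) + len(children) < pop_size:
            a, b = rng.sample(elite, 2)
            cut = rng.randrange(1, length)
            child = a[:cut] + b[cut:]                           # one-point crossover
            child = [g ^ (rng.random() < p_mut) for g in child]  # bit-flip mutation
            children.append(child)
        pop = elite + children
    best = max(pop, key=fitness)
    return best[:n], best[n:]  # reference mask, feature mask
```

In this sketch the chromosome length grows linearly with the number of samples plus features, which illustrates why the paper varies the encoding with problem size: a direct bit-per-reference encoding becomes unwieldy for large reference sets.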
Year | DOI | Venue |
---|---|---|
2010 | 10.1016/j.patrec.2010.01.001 | Pattern Recognition Letters |
Keywords | Field | DocType
---|---|---|
feature selection, conventional method, efficient design, various-scale problem, genetic algorithm, editing for the nn, better classification performance, nearest neighbor classifier, different genetic encoding strategy, appropriate editing, classification problem, new method, nearest neighbor, genetics | k-nearest neighbors algorithm, Data mining, Pattern recognition, Feature selection, Best bin first, Nearest-neighbor chain algorithm, Artificial intelligence, Large margin nearest neighbor, Classifier (linguistics), Mathematics, Nearest neighbor search, Genetic algorithm | Journal
Volume | Issue | ISSN
---|---|---|
31 | 9 | 0167-8655
Citations | PageRank | References
---|---|---|
1 | 0.36 | 12
Authors |
---|
4 |
Name | Order | Citations | PageRank |
---|---|---|---|
Heesung Lee | 1 | 90 | 9.01 |
Sungjun Hong | 2 | 47 | 5.58 |
Imran Fareed Nizami | 3 | 10 | 3.52 |
Euntai Kim | 4 | 1472 | 109.36 |