Abstract |
---|
This paper adopts a wrapper method to find the subset of features most relevant to the classification task. An improved estimate of the conditional mutual information serves as an independent measure for feature ranking in the local search operations, while the mutual information between the predictive labels of a trained classifier and the true classes serves as the fitness function in the global search for the best feature subset. Together, the local and global searches constitute a hybrid genetic algorithm for feature selection. Experimental results on a range of benchmark data sets demonstrate both parsimonious feature selection and excellent classification accuracy. |
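The abstract's two-level scheme (an information-theoretic measure for ranking, plus a wrapper fitness evaluated through a trained classifier inside a genetic algorithm) can be sketched in code. The following is a toy illustration under our own assumptions, not the authors' implementation: the plug-in estimators, the leave-one-out nearest-centroid classifier standing in for "a trained classifier", and all GA parameters are hypothetical choices made for brevity.

```python
import math
import random
from collections import Counter

def mutual_information(xs, ys):
    """Plug-in estimate of I(X;Y), in nats, for two discrete sequences."""
    n = len(xs)
    pxy, px, py = Counter(zip(xs, ys)), Counter(xs), Counter(ys)
    return sum((c / n) * math.log(c * n / (px[x] * py[y]))
               for (x, y), c in pxy.items())

def conditional_mi(xs, ys, zs):
    """I(X;Y|Z) = sum_z p(z) * I(X;Y | Z=z) -- the kind of estimate the
    paper uses to rank features in the local search (discrete case)."""
    n = len(zs)
    total = 0.0
    for z in set(zs):
        idx = [i for i in range(n) if zs[i] == z]
        total += (len(idx) / n) * mutual_information(
            [xs[i] for i in idx], [ys[i] for i in idx])
    return total

def nearest_centroid_predict(X, y, mask):
    """Leave-one-out nearest-centroid predictions restricted to the
    features selected by the 0/1 mask (a stand-in classifier)."""
    feats = [j for j, m in enumerate(mask) if m]
    if not feats:
        return [y[0]] * len(y)      # degenerate mask: constant prediction
    classes = sorted(set(y))
    preds = []
    for i in range(len(X)):
        best_c, best_d = classes[0], float("inf")
        for c in classes:
            pts = [X[k] for k in range(len(X)) if y[k] == c and k != i]
            if not pts:
                continue
            cent = [sum(p[j] for p in pts) / len(pts) for j in feats]
            d = sum((X[i][j] - m) ** 2 for j, m in zip(feats, cent))
            if d < best_d:
                best_c, best_d = c, d
        preds.append(best_c)
    return preds

def fitness(X, y, mask):
    """Wrapper fitness: MI between the classifier's predictions and y."""
    return mutual_information(nearest_centroid_predict(X, y, mask), y)

def ga_select(X, y, pop_size=8, gens=15, seed=0):
    """Tiny GA over feature masks: elitism, one-point crossover, mutation."""
    rng = random.Random(seed)
    d = len(X[0])
    pop = [[rng.randint(0, 1) for _ in range(d)] for _ in range(pop_size)]
    pop[0] = [1] * d                # ensure the full feature set is considered
    for _ in range(gens):
        pop.sort(key=lambda m: -fitness(X, y, m))
        elite = pop[: pop_size // 2]
        children = []
        while len(elite) + len(children) < pop_size:
            a, b = rng.sample(elite, 2)
            cut = rng.randrange(1, d) if d > 1 else 1
            child = a[:cut] + b[cut:]          # one-point crossover
            if rng.random() < 0.2:             # bit-flip mutation
                j = rng.randrange(d)
                child[j] ^= 1
            children.append(child)
        pop = elite + children
    return max(pop, key=lambda m: fitness(X, y, m))

# Toy demo: feature 0 encodes the class exactly, feature 1 is noise,
# so a parsimonious selection should keep feature 0.
data_rng = random.Random(1)
y = [i % 2 for i in range(20)]
X = [[float(c), data_rng.random()] for c in y]
best_mask = ga_select(X, y)
```

On the toy data the GA favors masks that include the informative feature, since only those let the classifier's predictions carry maximal mutual information with the true classes; the CMI estimator is shown separately because the paper applies it for ranking during local search rather than as the fitness itself.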
Year | DOI | Venue
---|---|---
2006 | 10.1109/ICPR.2006.198 | ICPR (2)
Keywords | Field | DocType
---|---|---
feature selection, excellent classification accuracy, classification task, mutual information, best subset, parsimonious feature selection, feature ranking, local search operation, conditional mutual information, global search, fitness function, genetic algorithms, local search | Data mining, Pattern recognition, Feature selection, Computer science, Feature (computer vision), Fitness function, Artificial intelligence, Mutual information, Local search (optimization), Conditional mutual information, Classifier (linguistics), Genetic algorithm | Conference
ISSN | ISBN | Citations
---|---|---
1051-4651 | 0-7695-2521-0 | 12
PageRank | References | Authors
---|---|---
0.84 | 8 | 3
Name | Order | Citations | PageRank
---|---|---|---
Jinjie Huang | 1 | 156 | 7.63
Yunze Cai | 2 | 346 | 24.82
Xiaoming Xu | 3 | 43 | 5.11