Abstract |
---|
In pattern recognition, feature selection aims to choose the smallest subset of features that is necessary and sufficient to describe the target concept. In this paper, a mutual information-based constructive criterion for feature selection is presented that holds under arbitrary information distributions of the input features. The criterion captures both the relevance to the output classes and the redundancy with respect to the already-selected features, without requiring any preset parameters such as the beta in the MIFS and MIFS-U methods. Furthermore, a modified greedy feature selection algorithm called MICC is proposed, and experimental results demonstrate its good performance on both synthetic and benchmark data sets. |
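The paper's MICC criterion itself is not reproduced in this record; as context, the following minimal sketch implements the classic MIFS-style greedy selection (relevance minus beta-weighted redundancy) that the abstract contrasts MICC against. The toy data, the feature names, and the beta value are all illustrative assumptions, not material from the paper.

```python
# Sketch of MIFS-style greedy feature selection (the beta-weighted baseline
# that MICC refines by removing the need to preset beta).
# All data and parameter values below are illustrative assumptions.
from math import log2
from collections import Counter

def mutual_info(xs, ys):
    """Mutual information I(X;Y) in bits for two discrete sequences."""
    n = len(xs)
    px, py, pxy = Counter(xs), Counter(ys), Counter(zip(xs, ys))
    return sum((c / n) * log2((c / n) / ((px[x] / n) * (py[y] / n)))
               for (x, y), c in pxy.items())

def greedy_select(features, target, k, beta=0.5):
    """Greedily pick k features maximizing
    I(f; C) - beta * sum over selected s of I(f; s)."""
    selected, remaining = [], list(features)
    while remaining and len(selected) < k:
        best = max(remaining,
                   key=lambda f: mutual_info(features[f], target)
                   - beta * sum(mutual_info(features[f], features[s])
                                for s in selected))
        selected.append(best)
        remaining.remove(best)
    return selected

# Toy example: f1 is perfectly relevant, f2 is a redundant (inverted) copy,
# f3 is only weakly related to the class labels.
target = [0, 0, 1, 1, 0, 1, 0, 1]
features = {
    "f1": target[:],
    "f2": [t ^ 1 for t in target],
    "f3": [0, 1, 0, 1, 0, 1, 0, 1],
}
print(greedy_select(features, target, k=2))  # → ['f1', 'f2']
```

With a fixed beta the trade-off between relevance and redundancy must be tuned by hand; the parameter-free criterion the abstract describes is precisely meant to avoid that tuning step.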
Year | DOI | Venue |
---|---|---|
2006 | 10.1109/COGINF.2006.365681 | IEEE ICCI |
Keywords | Field | DocType
---|---|---
mutual information-based constructive criterion, pattern recognition, pattern classification, MICC, learning (artificial intelligence), feature extraction, greedy algorithms, filtering theory, feature selection, mutual information, machine learning, filter approach, greedy feature selection algorithm, noise measurement, computational complexity, concrete | Data mining, Pattern recognition, Feature selection, Computer science, Feature (computer vision), Greedy algorithm, Feature extraction, Minimum redundancy feature selection, Redundancy (engineering), Mutual information, Artificial intelligence, Computational complexity theory | Conference
Volume | ISBN | Citations
---|---|---
1 | 1-4244-0475-4 | 11
PageRank | References | Authors
---|---|---
0.71 | 10 | 3
Name | Order | Citations | PageRank |
---|---|---|---|
Jinjie Huang | 1 | 156 | 7.63 |
Yunze Cai | 2 | 346 | 24.82 |
Xiaoming Xu | 3 | 223 | 10.15 |