Abstract |
---|
Practical machine learning algorithms are known to degrade in performance when faced with many features that are not necessary for rule discovery. To cope with this problem, many methods have been proposed for selecting a subset of features on which to focus the analysis. Two typical approaches are the filter approach, which selects a feature subset in a preprocessing step, and the wrapper approach, which searches the space of possible feature subsets for an optimal one, using the induction algorithm itself as part of the evaluation function. Although the filter approach is faster, it is somewhat blind: the performance of the induction algorithm is not taken into account. The wrapper approach, on the other hand, can find optimal feature subsets, but its time and space complexity makes it hard to apply. In this paper, we propose a feature selection algorithm that uses rough set methodology with greedy heuristics. As in the filter approach, features are selected in a preprocessing step, but the performance of induction is considered in the evaluation criterion: we select the features that damage the performance of induction as little as possible. |
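The greedy rough-set selection sketched in the abstract can be illustrated with a QuickReduct-style forward search over a decision table. This is a standard rough-set formulation (dependency degree as the fraction of objects in the positive region), offered here as an assumed sketch, not necessarily the exact heuristic or criterion of the paper; the function names and the toy decision table are hypothetical.

```python
from collections import defaultdict

def dependency(rows, features, decision):
    """Dependency degree gamma(B, d): the fraction of rows whose
    equivalence class under attribute subset B is consistent on the
    decision attribute (i.e. lies in the positive region)."""
    if not rows:
        return 0.0
    classes = defaultdict(set)
    for row in rows:
        classes[tuple(row[f] for f in features)].add(row[decision])
    consistent = sum(1 for row in rows
                     if len(classes[tuple(row[f] for f in features)]) == 1)
    return consistent / len(rows)

def quickreduct(rows, all_features, decision):
    """Greedy forward selection: repeatedly add the feature that most
    increases the dependency degree, stopping once it matches that of
    the full feature set (so induction performance is preserved)."""
    target = dependency(rows, all_features, decision)
    reduct = []
    while dependency(rows, reduct, decision) < target:
        best = max((f for f in all_features if f not in reduct),
                   key=lambda f: dependency(rows, reduct + [f], decision))
        reduct.append(best)
    return reduct

# Toy decision table: the decision d depends only on attribute a,
# so b is redundant and the greedy search should drop it.
table = [
    {"a": 0, "b": 0, "d": 0},
    {"a": 0, "b": 1, "d": 0},
    {"a": 1, "b": 0, "d": 1},
    {"a": 1, "b": 1, "d": 1},
]
print(quickreduct(table, ["a", "b"], "d"))
```

Because each candidate feature is scored only by how much it raises the dependency degree, the search stays as cheap as a filter method while still guarding the discriminating power the induction step relies on.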
Year | DOI | Venue |
---|---|---
1999 | 10.1007/978-3-540-48061-7_22 | RSFDGrC |
Keywords | Field | DocType
---|---|---
feature selection, rough sets | Decision table, Feature selection, Pattern recognition, Feature (computer vision), Computer science, Evaluation function, Greedy algorithm, Rough set, Preprocessor, Heuristics, Artificial intelligence, Machine learning | Conference
Volume | ISSN | ISBN
---|---|---
1711 | 0302-9743 | 3-540-66645-1
Citations | PageRank | References
---|---|---
17 | 1.29 | 6
Authors |
---|
3 |
Name | Order | Citations | PageRank |
---|---|---|---
Juzhen Dong | 1 | 214 | 17.05 |
Ning Zhong | 2 | 2907 | 300.63 |
Setsuo Ohsuga | 3 | 960 | 222.02 |