Abstract |
---|
Feature selection is an important issue in pattern recognition and machine learning, which aims at selecting relevant features from a set of candidates. Clearly, establishing proper criteria to evaluate the relevance of features is pivotal to the selection. In this paper, a criterion is proposed to assess the relevance of individual input features based on Radial Basis Function Neural Networks (RBFNNs) involved in solving classification and regression problems. The criterion takes a quantified output sensitivity of RBFNNs to input variation as a measure, which is defined as a mathematical expectation and can, in a statistical sense, reflect the effect of an RBFNN's input variation on its output. The basic idea is that a well-trained RBFNN captures the relevant features of the problem it deals with and thus becomes more sensitive to variation in those input features that contribute more to the RBFNN's behavior. Since the sensitivity is difficult to compute exactly, a numerical integration technique is employed to approximate it. Experimental results on several artificial and real datasets show that the proposed feature selection approach works well. (c) 2017 Elsevier B.V. All rights reserved. |
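The sensitivity-based idea sketched in the abstract can be illustrated roughly as follows: train an RBFNN, then estimate each feature's relevance as the expected squared change in the network's output when that feature is perturbed. The sketch below is a simplified stand-in, not the authors' implementation: it uses a toy least-squares RBFNN, a Monte Carlo estimate in place of the paper's numerical integration, and illustrative data in which only features 0 and 1 are relevant. All names, the perturbation size `delta`, and the RBF width are assumptions for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

def rbf_design(X, centers, width):
    # Gaussian RBF activations for every (sample, center) pair.
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-d2 / (2.0 * width ** 2))

# Toy regression data: only features 0 and 1 drive the target.
X = rng.normal(size=(400, 5))
y = np.sin(X[:, 0]) + 0.5 * X[:, 1] ** 2

# "Train" a minimal RBFNN: fixed random centers, output weights by least squares.
centers = X[rng.choice(len(X), 30, replace=False)]
H = rbf_design(X, centers, width=1.5)
w, *_ = np.linalg.lstsq(H, y, rcond=None)

def predict(Xq):
    return rbf_design(Xq, centers, 1.5) @ w

def sensitivity(feature, delta=0.3, n=1000):
    # Monte Carlo estimate of E[(f(x + delta * e_i) - f(x))^2],
    # sampling x from the training data.
    Xs = X[rng.choice(len(X), n)]
    Xp = Xs.copy()
    Xp[:, feature] += delta
    return float(np.mean((predict(Xp) - predict(Xs)) ** 2))

scores = [sensitivity(i) for i in range(X.shape[1])]
ranking = np.argsort(scores)[::-1]
print("sensitivity scores:", np.round(scores, 4))
print("feature ranking (most relevant first):", ranking)
```

Features that the trained network actually uses should yield larger sensitivity scores, so ranking by the score recovers the relevant features; irrelevant inputs only perturb the fitted surface weakly.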
Year | DOI | Venue |
---|---|---|
2018 | 10.1016/j.neucom.2017.10.055 | NEUROCOMPUTING |
Keywords | Field | DocType
---|---|---
Feature selection, RBFNN, Sensitivity, Relevance | Pattern recognition, Feature selection, Statistics, Radial basis function network, Computer science, Expected value, Artificial intelligence, Regression problems, Machine learning | Journal
Volume | ISSN | Citations
---|---|---
275 | 0925-2312 | 0
PageRank | References | Authors
---|---|---
0.34 | 37 | 4
Name | Order | Citations | PageRank |
---|---|---|---
Xiaoqin Zeng | 1 | 407 | 32.97 |
Zhilong Zhen | 2 | 6 | 1.15 |
Jiasheng He | 3 | 0 | 1.01 |
Lixin Han | 4 | 135 | 14.47 |