Abstract |
---|
The sensitivity of a neural network's output to perturbations of its inputs is an important measure for evaluating the network's performance. To make sensitivity a practical tool for designing and implementing Multilayer Perceptrons (MLPs), this paper proposes a general approach to quantifying the sensitivity of MLPs. The sensitivity is defined as the mathematical expectation, taken over all possible inputs, of the absolute output deviation caused by input perturbations, and it is computed in a bottom-up way: the sensitivity of a single neuron is considered first, and then that of the entire network. The main contribution of the approach is that it requires only a weak assumption on the input: the input elements need only be independent of each other, without being restricted to any particular distribution, which makes the approach more applicable to real applications. Experimental results on both artificial and real datasets demonstrate that the proposed approach is highly accurate. |
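The paper computes this sensitivity analytically, but the quantity it defines can be illustrated with a simple Monte Carlo estimate: sample inputs, apply small independent perturbations, and average the absolute output deviation. The sketch below is only an illustration of the defined quantity, not the paper's bottom-up derivation; the function names, the uniform input distribution, and the perturbation magnitude `delta` are all assumptions for the example.

```python
import numpy as np

def mlp_forward(x, weights, biases):
    """Forward pass of a simple MLP with tanh hidden activations.

    weights/biases are lists of per-layer parameters (hypothetical layout
    for this example)."""
    a = x
    for W, b in zip(weights[:-1], biases[:-1]):
        a = np.tanh(a @ W + b)
    return a @ weights[-1] + biases[-1]

def monte_carlo_sensitivity(weights, biases, n_inputs,
                            delta=0.01, n_samples=10_000, seed=0):
    """Estimate E[|f(x + dx) - f(x)|], the expected absolute output
    deviation under small independent input perturbations.

    Inputs are drawn i.i.d. uniform on [-1, 1] purely for illustration;
    the paper only assumes the input elements are mutually independent."""
    rng = np.random.default_rng(seed)
    x = rng.uniform(-1.0, 1.0, size=(n_samples, n_inputs))
    dx = rng.uniform(-delta, delta, size=x.shape)  # independent perturbations
    y0 = mlp_forward(x, weights, biases)
    y1 = mlp_forward(x + dx, weights, biases)
    return float(np.mean(np.abs(y1 - y0)))
```

A Monte Carlo estimate like this converges slowly; the appeal of the paper's approach is precisely that it quantifies the same expectation without sampling, while keeping only the independence assumption on the inputs.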
Year | DOI | Venue |
---|---|---|
2013 | 10.1016/j.neucom.2012.07.020 | Neurocomputing |
Keywords | Field | DocType
---|---|---|
real application, artificial datasets, possible input, input perturbation, input element, absolute output deviation, neural network, multilayer perceptron sensitivity, general approach, entire network, sensitivity, multilayer perceptron | Pattern recognition, Computer science, Multilayer perceptron, Expected value, Artificial intelligence, Artificial neural network, Perceptron, Machine learning, Perturbation, Computation | Journal
Volume | ISSN | Citations
---|---|---|
99 | 0925-2312 | 2
PageRank | References | Authors
---|---|---|
0.37 | 22 | 3
Name | Order | Citations | PageRank |
---|---|---|---|
Jing Yang | 1 | 2 | 0.37 |
Xiaoqin Zeng | 2 | 407 | 32.97 |
Shuiming Zhong | 3 | 79 | 7.30 |