Abstract |
---|
Injecting noise into the inputs during the training of feedforward neural networks (FNNs) can remarkably improve their generalization performance. Reported works justify this fact by arguing that noise injection is equivalent to a smoothing regularization, with the input noise variance playing the role of the regularization parameter. The success of this approach depends on an appropriate choice of the input noise variance. However, it is often not known a priori whether the degree of smoothness imposed on the FNN mapping is consistent with the unknown function to be approximated. In order to gain better control over this smoothing effect, a cost function is proposed that balances the smoothed fitting induced by noise injection against the precision of approximation. The second term, which aims at penalizing the undesirable effect of input noise injection, or equivalently at controlling the deviation of the randomly perturbed cost, is obtained by expressing a certain distance between the original cost function and its randomly perturbed version. In fact, this term can be derived in general for parametric models that satisfy the Lipschitz property. An example is included to illustrate the effectiveness of learning with the proposed cost function when noise injection is used. |
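The basic mechanism the abstract builds on, training an FNN on Gaussian-perturbed inputs with the noise variance acting as the smoothing knob, can be sketched as follows. This is a minimal illustration, not the paper's proposed cost function: the toy network, data, and the variable `sigma2` are assumptions introduced here.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy regression data (hypothetical): y = sin(x) plus small output noise.
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(X) + 0.05 * rng.standard_normal(X.shape)

# Small FNN: 1 input -> 10 tanh hidden units -> 1 linear output.
W1 = 0.5 * rng.standard_normal((1, 10)); b1 = np.zeros(10)
W2 = 0.5 * rng.standard_normal((10, 1)); b2 = np.zeros(1)

sigma2 = 0.01  # input noise variance: the regularization parameter
lr = 0.05

def mse_on_clean_inputs():
    return float(np.mean((np.tanh(X @ W1 + b1) @ W2 + b2 - y) ** 2))

mse0 = mse_on_clean_inputs()

for epoch in range(500):
    # Noise injection: perturb the inputs with zero-mean Gaussian noise
    # of variance sigma2 at every pass.
    Xn = X + np.sqrt(sigma2) * rng.standard_normal(X.shape)
    H = np.tanh(Xn @ W1 + b1)          # hidden activations
    err = (H @ W2 + b2) - y            # residual of the perturbed cost
    # Gradient descent on the squared-error cost (backpropagation).
    gW2 = H.T @ err / len(X)
    gb2 = err.mean(axis=0)
    dH = (err @ W2.T) * (1 - H ** 2)   # tanh derivative
    gW1 = Xn.T @ dH / len(X)
    gb1 = dH.mean(axis=0)
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1

mse = mse_on_clean_inputs()
```

Raising `sigma2` smooths the learned mapping more aggressively; the paper's contribution is precisely a second cost term that keeps this smoothing from degrading approximation precision when the right variance is unknown.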
Year | DOI | Venue |
---|---|---|
2002 | 10.1109/NNSP.2002.1030026 | NNSP |
Keywords | Field | DocType
---|---|---|
feedforward neural nets,function approximation,generalisation (artificial intelligence),learning (artificial intelligence),smoothing methods,fnn,lipschitz property,cost function,deviation control,feedforward neural networks,generalization performance,input noise injection,input noise variance,learning,random perturbed cost,smoothing regularization,training,unknown function approximation,approximation algorithms,neural networks,learning artificial intelligence,feedforward neural network,parametric model,satisfiability,fuzzy control | Approximation algorithm,Feedforward neural network,Function approximation,Control theory,Regularization (mathematics),Smoothing,Lipschitz continuity,Artificial neural network,Smoothness,Mathematics | Conference
ISBN | Citations | PageRank
---|---|---|
0-7803-7616-1 | 0 | 0.34
References | Authors
---|---|
8 | 3
Name | Order | Citations | PageRank |
---|---|---|---|
Abd-Krim Seghouane | 1 | 78 | 12.27 |
Moudden, Y. | 2 | 0 | 0.34 |
G. A. Fleury | 3 | 151 | 27.74 |