Title
Obtaining Fault Tolerant Multilayer Perceptrons Using an Explicit Regularization
Abstract
When the learning algorithm is applied to an MLP structure, different solutions for the weight values can be obtained if the parameters of the applied rule or the initial conditions are changed. Those solutions can present similar performance with respect to learning, but they differ in other aspects, in particular in fault tolerance against weight perturbations. In this paper, a backpropagation algorithm that maximizes fault tolerance is proposed. The proposed algorithm explicitly adds to the backpropagation learning rule a new term related to the mean square error degradation in the presence of weight deviations, in order to minimize this degradation. The results obtained demonstrate the efficiency of the proposed learning rule in comparison with other algorithms.
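As a rough illustration of the idea summarized in the abstract, the sketch below trains a one-hidden-layer perceptron on a loss that combines the mean square error with a penalty on the output's sensitivity to weight deviations, a first-order proxy for the mean square error degradation the paper targets. This is a minimal sketch under stated assumptions, not the authors' exact formulation; the names mlp, sensitivity_penalty and LAMBDA are illustrative.

```python
# Minimal sketch (not the paper's exact formulation): regularize backpropagation
# with a penalty on the output's sensitivity to weight deviations. To first order,
# zero-mean weight noise of variance sigma^2 degrades the MSE by roughly
# sigma^2 * E[ sum_i (d y / d w_i)^2 ], so penalizing that sum favours
# fault-tolerant weight configurations. All names here are illustrative.
import jax
import jax.numpy as jnp

def mlp(params, x):
    # One-hidden-layer perceptron with tanh hidden units and a scalar output.
    (W1, b1), (W2, b2) = params
    h = jnp.tanh(x @ W1 + b1)
    return jnp.squeeze(h @ W2 + b2)

def mse(params, x, y):
    return jnp.mean((mlp(params, x) - y) ** 2)

def sensitivity_penalty(params, x):
    # Batch average of sum_i (d output / d w_i)^2: a first-order proxy for
    # the MSE degradation under small zero-mean weight perturbations.
    def per_example(xi):
        g = jax.grad(lambda p: mlp(p, xi))(params)
        return sum(jnp.sum(leaf ** 2) for leaf in jax.tree_util.tree_leaves(g))
    return jnp.mean(jax.vmap(per_example)(x))

LAMBDA = 0.1  # illustrative trade-off between accuracy and fault tolerance

def total_loss(params, x, y):
    return mse(params, x, y) + LAMBDA * sensitivity_penalty(params, x)

@jax.jit
def sgd_step(params, x, y, lr=0.01):
    # Ordinary backpropagation (gradient descent) on the regularized loss.
    grads = jax.grad(total_loss)(params, x, y)
    return jax.tree_util.tree_map(lambda p, g: p - lr * g, params, grads)
```

The design choice here is simply to fold the sensitivity term into the loss so that standard backpropagation minimizes accuracy and robustness jointly; the paper instead derives the extra term explicitly in the learning rule.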
Year
2000
DOI
10.1023/A:1009698206772
Venue
Neural Processing Letters
Keywords
backpropagation, regularization, multilayer perceptron, fault tolerance, mean square sensitivity
Field
Delta rule, Algorithm, Mean squared error, Fault tolerance, Learning rule, Multilayer perceptron, Artificial intelligence, Backpropagation, Artificial neural network, Perceptron, Machine learning, Mathematics
DocType
Journal
Volume
12
Issue
2
ISSN
1573-773X
Citations
24
PageRank
0.92
References
6
Authors
5
Name | Order | Citations | PageRank
Jose L. Bernier | 1 | 36 | 1.66
J. Ortega | 2 | 940 | 73.05
I. Rojas | 3 | 1750 | 143.09
Eduardo Ros | 4 | 1100 | 86.00
A. Prieto | 5 | 419 | 25.23