Title
A cost function for learning feedforward neural networks subject to noisy inputs
Abstract
Most algorithms used for training feedforward neural networks (NN) are based on the minimization of a least squares output error cost function. Such a cost function gives good results when the training set consists of noisy outputs and exactly known inputs. However, when data are collected during an identification experiment, noise on the measured inputs may be unavoidable. In that case these algorithms produce biased estimates of the NN parameters, and hence biased predicted outputs. This paper proposes a cost function whose optimisation reduces the effect of input noise on the estimated NN parameters. It is constructed by adding a specific regularization term to the least squares output error cost function. A simulation example demonstrates the robustness to noisy inputs of a NN trained with this cost function.
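The idea in the abstract, a least squares output error term plus a regularizer that compensates for noise on the inputs, can be illustrated numerically. The sketch below is an assumption-laden stand-in, not the paper's method: the abstract does not give the actual regularization term, so the classical first-order choice is used instead, the squared input Jacobian of the network weighted by the input-noise variance `sigma2` (to first order, this equals the extra output error caused by zero-mean input noise of that variance). The names `mlp_forward`, `input_jacobian`, and `regularized_cost` are hypothetical.

```python
import numpy as np


def mlp_forward(x, W1, b1, W2, b2):
    """One-hidden-layer network: y = W2 @ tanh(W1 @ x + b1) + b2."""
    h = np.tanh(W1 @ x + b1)
    return W2 @ h + b2


def input_jacobian(x, W1, b1, W2):
    """Jacobian dy/dx of the network above (uses tanh' = 1 - tanh^2)."""
    h = np.tanh(W1 @ x + b1)
    return W2 @ (np.diag(1.0 - h ** 2) @ W1)


def regularized_cost(X, Y, params, sigma2):
    """Least squares output error plus an input-noise penalty.

    The penalty sigma2 * ||dy/dx||^2 is an ASSUMED stand-in for the
    paper's regularization term: it is the standard first-order
    correction for zero-mean input noise of variance sigma2.
    With sigma2 = 0 this reduces to the plain least squares cost.
    """
    W1, b1, W2, b2 = params
    cost = 0.0
    for x, y in zip(X, Y):
        y_hat = mlp_forward(x, W1, b1, W2, b2)
        J = input_jacobian(x, W1, b1, W2)
        cost += np.sum((y - y_hat) ** 2) + sigma2 * np.sum(J ** 2)
    return cost / len(X)
```

Minimizing `regularized_cost` instead of the plain least squares cost penalizes networks whose output is highly sensitive to input perturbations, which is what reduces the bias induced by noisy training inputs.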
Year
2001
DOI
10.1109/ISSPA.2001.950161
Venue
Sixth International Symposium on Signal Processing and its Applications, 2001
Keywords
feedforward neural nets, learning (artificial intelligence), least squares approximations, optimisation, parameter estimation, signal processing, feedforward neural networks, identification, input noise, learning, least squares, output error cost function, regularization term, robustness, training
Field
Least squares, Training set, Signal processing, Feedforward neural network, Pattern recognition, Computer science, Control theory, Robustness (computer science), Regularization (mathematics), Minification, Artificial intelligence, Estimation theory
DocType
Conference
Volume
2
ISBN
0-7803-6703-0
Citations
0
PageRank
0.34
References
7
Authors
2
Name                 Order  Citations  PageRank
Abd-Krim Seghouane   1      78         12.27
G. A. Fleury         2      151        27.74