Title: Automatic Scaling using Gamma Learning for Feedforward Neural Networks
Abstract: Standard error back-propagation requires output data that is scaled to lie within the active area of the activation function. We show that normalizing data to conform to this requirement is not only time-consuming, but can also introduce inaccuracies in the modelling of the data. In this paper we propose the gamma learning rule for feedforward neural networks, which eliminates the need to scale output data before training. We show that the utilization of "self-scaling" units results in...
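The abstract is truncated, but the core idea lends itself to a short illustration. The sketch below is a minimal reconstruction, not the paper's verbatim algorithm: it assumes a "self-scaling" output unit of the form o = γ·σ(net), where the scale γ is trained by gradient descent alongside the weights, so raw, unnormalized targets can be presented directly. The toy data, network sizes, and learning rate are all illustrative assumptions.

```python
# Minimal sketch of a self-scaling output unit (our reconstruction, not the
# paper's exact gamma learning rule).
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(0)

# Toy regression with deliberately UNSCALED targets in roughly [0, 100];
# standard back-propagation would first normalize T into the sigmoid's range.
X = rng.uniform(-1.0, 1.0, size=(200, 1))
T = 50.0 * (X[:, 0] + 1.0)

n_in, n_hid, n_out = 1, 8, 1
W1 = rng.normal(0.0, 0.5, (n_hid, n_in)); b1 = np.zeros(n_hid)
W2 = rng.normal(0.0, 0.5, (n_out, n_hid)); b2 = np.zeros(n_out)
gamma = np.ones(n_out)  # trainable output scale (the assumed "gamma")

lr = 0.005
for epoch in range(500):
    for x, t in zip(X, T):
        h = sigmoid(W1 @ x + b1)   # hidden activations
        s = sigmoid(W2 @ h + b2)   # squashed output in (0, 1)
        o = gamma * s              # self-scaling output unit

        err = t - o                          # for E = 0.5 * err**2
        d_gamma = err * s                    # gradient step for gamma
        delta_o = err * gamma * s * (1.0 - s)
        delta_h = (W2.T @ delta_o) * h * (1.0 - h)

        gamma += lr * d_gamma
        W2 += lr * np.outer(delta_o, h); b2 += lr * delta_o
        W1 += lr * np.outer(delta_h, x); b1 += lr * delta_h

print("learned output scale gamma:", gamma)
```

Under this parameterization the gradient of the squared error with respect to γ is simply -(t - o)·σ(net), so the unit stretches its own output range toward the scale of the raw targets instead of requiring the targets to be pre-scaled into the sigmoid's active area.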
Year: 1995
DOI: 10.1007/3-540-59497-3_198
Venue: IWANN
Keywords: gamma learning, automatic scaling, feedforward neural networks, activation function, feedforward neural network, back propagation, standard error
Field: Convergence (routing), Feedforward neural network, Normalization (statistics), Pattern recognition, Computer science, Activation function, Recurrent neural network, Probabilistic neural network, Learning rule, Time delay neural network, Artificial intelligence, Machine learning
DocType: Conference
Volume: 930
ISSN: 0302-9743
ISBN: 3-540-59497-3
Citations: 3
PageRank: 0.43
References: 3
Authors: 4

Name                        Order  Citations  PageRank
Andries Petrus Engelbrecht  1      2183       125.32
Ian Cloete                  2      132        16.61
J. Geldenhuys               3      3          0.43
Jacek M. Zurada             4      2553       226.22