Title
Effects of Noise on Convergence and Generalization in Recurrent Networks
Abstract
We introduce and study methods of inserting synaptic noise into dynamically-driven recurrent neural networks and show that applying a controlled amount of noise during training may improve convergence and generalization. In addition, we analyze the effects of each noise parameter (additive vs. multiplicative, cumulative vs. non-cumulative, per time step vs. per string) and predict that best overall performance can be achieved by injecting additive noise at each time step. Extensive simulations on learning the dual parity grammar from temporal strings substantiate these predictions.
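Below is a minimal sketch of the abstract's predicted-best variant, additive synaptic noise injected independently at each time step. It assumes NumPy and a simple Elman-style recurrent cell; the function name, weight shapes, and noise level sigma are illustrative assumptions, not details taken from the paper.

    import numpy as np

    def rnn_forward_with_additive_noise(W_in, W_rec, x_seq, sigma=0.05, rng=None):
        """Forward pass of a simple recurrent network with additive,
        non-cumulative synaptic noise sampled fresh at every time step.

        W_in  : (hidden, input) input weight matrix
        W_rec : (hidden, hidden) recurrent weight matrix
        x_seq : (T, input) one temporal string, one vector per time step
        sigma : std. dev. of the zero-mean Gaussian weight noise
                (illustrative value, not from the paper)
        """
        rng = np.random.default_rng() if rng is None else rng
        h = np.zeros(W_rec.shape[0])
        for x in x_seq:
            # Additive, per-time-step noise: perturb the weights with fresh
            # zero-mean Gaussian samples at each step of the string.
            W_in_noisy = W_in + rng.normal(0.0, sigma, W_in.shape)
            W_rec_noisy = W_rec + rng.normal(0.0, sigma, W_rec.shape)
            h = np.tanh(W_in_noisy @ x + W_rec_noisy @ h)
        return h

The other variants named in the abstract differ only in how the perturbation is formed: multiplicative noise scales each weight (e.g. W * (1 + noise)) instead of adding to it, cumulative noise lets perturbations accumulate across time steps instead of being resampled, and per-string noise draws one perturbation for the whole string rather than one per step.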
Year
1994
Venue
NIPS
Field
Convergence (routing), Multiplicative function, Computer science, Recurrent neural network, Grammar, Artificial intelligence, Machine learning, Synaptic noise
DocType
Conference
Citations
10
PageRank
1.67
References
6
Authors
3
Name           Order  Citations  PageRank
Kam-Chuen Jim  1      107        13.01
Bill G. Horne  2      334        33.87
C. Lee Giles   3      11154      1549.48