Title
A recalling-enhanced recurrent neural network: conjugate gradient learning algorithm and its convergence analysis
Abstract
The Elman network is a classical recurrent neural network with an internal delay feedback. In this paper, we propose a recalling-enhanced recurrent neural network (RERNN) that has a selective memory property. In addition, an improved conjugate gradient algorithm with a generalized Armijo search technique, which speeds up the convergence rate, is used to train the RERNN model. Further performance improvement is achieved with adaptive learning coefficients. Finally, we prove weak and strong convergence of the presented algorithm; that is, as the number of training steps increases, the following holds for RERNN: (1) the gradient norm of the error function with respect to the weight vectors converges to zero, and (2) the weight sequence approaches a fixed optimal point. We have carried out a number of simulations to illustrate and verify these theoretical results and to demonstrate the efficiency of the proposed algorithm.
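The abstract describes the training scheme only at a high level. As a rough, generic illustration (not the paper's exact update rule or notation), the LaTeX sketch below shows a standard conjugate gradient weight update paired with an Armijo-type sufficient-decrease test on the learning rate; the symbols E (error function), w^k (weight vector), g^k (gradient), d^k (search direction), beta_k (conjugate coefficient), eta_k (learning rate), and sigma are assumed placeholders, not taken from the paper.

% Generic conjugate gradient update for minimizing an error function E (sketch only):
\[
g^k = \nabla E(w^k), \qquad
d^k =
\begin{cases}
-g^k, & k = 0,\\
-g^k + \beta_k\, d^{k-1}, & k \ge 1,
\end{cases}
\qquad
w^{k+1} = w^k + \eta_k\, d^k,
\]
% where the learning rate eta_k is accepted only if it yields sufficient decrease (Armijo-type test):
\[
E(w^k + \eta_k d^k) \le E(w^k) + \sigma\, \eta_k\, (g^k)^{\top} d^k, \qquad \sigma \in (0,1).
\]

Generalized Armijo variants typically also keep eta_k within fixed positive bounds so that the accepted step sizes support convergence arguments; the precise conditions, the choice of beta_k, and the adaptive learning coefficients are specified in the paper itself.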
Year
2020
DOI
10.1016/j.ins.2020.01.045
Venue
Information Sciences
Keywords
Recurrent, Neural network, Conjugate gradient, Generalized Armijo search, Monotonicity, Convergence
Field
Conjugate gradient method, Convergence (routing), Error function, Recurrent neural network, Algorithm, Conjugate, Rate of convergence, Adaptive learning, Mathematics
DocType
Journal
Volume
519
ISSN
0020-0255
Citations
0
PageRank
0.34
References
0
Authors
7
Name             Order  Citations  PageRank
Gao Tao          1      361        5.22
Xiaoling Gong    2      0          1.01
Kai Zhang        3      682        8.31
Feng Lin         4      0          0.34
Jian Wang        5      44         7.22
Tingwen Huang    6      5684       310.24
Jacek M. Zurada  7      2553       226.22