Title: Accelerated gradient algorithm for RBF neural network
Abstract: Gradient-based algorithms are commonly used for training radial basis function neural networks (RBFNNs). However, one challenge in the training process is avoiding the vanishing gradient. To address this problem, this paper designs an accelerated gradient algorithm (AGA) to improve the learning performance of RBFNN. First, an indirect detection mechanism, based on the instantaneous gradient decay rate (IGDR) and the instantaneous convergence rate (ICR), is developed to identify vanishing gradients during learning. Second, an amplification gradient strategy (AGS), which increases the gradient values of the learning parameters, is designed to accelerate the learning speed of RBFNN. Third, an analysis of the AGA-based RBFNN (AGA-RBFNN) is given to guarantee its successful application. Finally, benchmark and real-world problems are used to demonstrate the effectiveness of AGA-RBFNN.
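The abstract describes detecting a vanishing gradient and amplifying it during RBFNN training. The following is a minimal toy sketch of that idea, not the authors' implementation: an RBF network with fixed Gaussian centers trains its output weights by gradient descent, and a crude stand-in detection rule (a sharp drop in gradient norm while the loss is still large) triggers a gradient amplification step. The detection rule and amplification factor are illustrative assumptions, not the paper's IGDR/ICR definitions.

```python
import numpy as np

rng = np.random.default_rng(0)

# 1-D regression task: fit y = sin(x) on [-3, 3]
X = np.linspace(-3, 3, 200).reshape(-1, 1)
y = np.sin(X).ravel()

# Fixed RBF centers and a shared width; only output weights are trained here
centers = np.linspace(-3, 3, 10).reshape(-1, 1)
width = 0.8

def hidden(X):
    # Gaussian RBF activations, shape (n_samples, n_centers)
    d2 = (X - centers.T) ** 2
    return np.exp(-d2 / (2 * width ** 2))

H = hidden(X)
w = rng.normal(scale=0.1, size=H.shape[1])

lr = 0.05
prev_gnorm = None
for epoch in range(500):
    err = H @ w - y
    grad = H.T @ err / len(y)
    gnorm = np.linalg.norm(grad)
    # Illustrative stand-in for the paper's detection mechanism: if the
    # gradient norm decays sharply while the loss is still large, rescale
    # the gradient back up (an "amplification" step).
    if prev_gnorm is not None and gnorm < 0.5 * prev_gnorm and np.mean(err ** 2) > 1e-3:
        grad = grad * (prev_gnorm / (gnorm + 1e-12))
    prev_gnorm = gnorm
    w -= lr * grad

mse = np.mean((H @ w - y) ** 2)
```

With these settings the plain gradient step already converges smoothly, so the amplification branch rarely fires; its purpose here is only to show where such a detection-and-amplification mechanism would sit in the training loop.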
Year: 2021
DOI: 10.1016/j.neucom.2021.02.009
Venue: Neurocomputing
Keywords: Radial basis function neural network; Instantaneous gradient decay rate; Instantaneous convergence rate; Amplification gradient strategy; Accelerated gradient algorithm
DocType: Journal
Volume: 441
ISSN: 0925-2312
Citations: 1
PageRank: 0.35
References: 0
Authors: 3
Name / Order / Citations / PageRank
Hong-Gui Han / 1 / 476 / 39.06
Miaoli Ma / 2 / 1 / 0.35
Jun-Fei Qiao / 3 / 691 / 5.62