Title
Speed up training of the recurrent neural network based on constrained optimization techniques
Abstract
In this paper, a constrained optimization technique is explored for a substantial problem: accelerating the training of globally recurrent neural networks. Unlike most previous methods, which target feedforward neural networks, the authors adopt a constrained optimization technique to improve the gradient-based learning algorithm of the globally recurrent neural network, yielding an adaptive learning rate during training. Using the recurrent network with the improved algorithm, experiments on two real-world problems, namely filtering additive noise from acoustic data and classifying temporal signals for speaker identification, have been performed. The experimental results show that the recurrent neural network with the improved learning algorithm trains significantly faster and achieves satisfactory performance.
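The abstract describes gradient-based training of a recurrent network with a learning rate that adapts during training. As a rough illustration only (the paper's actual constrained-optimization update rule is not given here), the following sketch trains a tiny recurrent network with numerical gradients and a generic "bold driver" adaptive-rate heuristic: grow the rate after an improving step, shrink it after a rejected one. All names, sizes, and the toy next-value prediction task are assumptions, not the authors' setup.

```python
import numpy as np

# Illustrative sketch only: a generic adaptive-learning-rate heuristic
# ("bold driver"), NOT the paper's constrained-optimization rule.
rng = np.random.default_rng(0)

# Toy task: predict the next sample of a sine wave with a small RNN.
T, H = 40, 6
xs = np.sin(np.linspace(0.0, 4.0 * np.pi, T + 1))
inputs, targets = xs[:-1], xs[1:]

n_params = H + H * H + H          # Wxh, Whh, Why packed into one vector
theta = rng.normal(0.0, 0.1, n_params)

def loss(theta):
    """Mean squared error of the RNN unrolled over the sequence."""
    Wxh = theta[:H]
    Whh = theta[H:H + H * H].reshape(H, H)
    Why = theta[H + H * H:]
    h = np.zeros(H)
    total = 0.0
    for x, t in zip(inputs, targets):
        h = np.tanh(Wxh * x + Whh @ h)        # recurrent state update
        total += 0.5 * (Why @ h - t) ** 2
    return total / T

def num_grad(theta, eps=1e-5):
    """Central-difference gradient (slow but simple and correct)."""
    g = np.zeros_like(theta)
    for i in range(theta.size):
        d = np.zeros_like(theta)
        d[i] = eps
        g[i] = (loss(theta + d) - loss(theta - d)) / (2.0 * eps)
    return g

lr = 0.05
first = prev = loss(theta)
for epoch in range(150):
    step = theta - lr * num_grad(theta)
    cur = loss(step)
    if cur < prev:                # step accepted: grow the rate
        theta, prev = step, cur
        lr *= 1.05
    else:                         # step rejected: shrink the rate
        lr *= 0.5
```

Because a step is kept only when it lowers the loss, the loop is monotone by construction; the adaptive rate simply lets accepted steps grow larger, which is the qualitative behaviour the abstract attributes to the improved algorithm.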
Year: 1996
DOI: 10.1007/BF02951621
Venue: J. Comput. Sci. Technol.
Keywords: recurrent neural network, neural network, feedforward neural network, constrained optimization
Field: Feedforward neural network, Computer science, Stochastic neural network, Recurrent neural network, Probabilistic neural network, Time delay neural network, Types of artificial neural networks, Artificial intelligence, Deep learning, Machine learning, Constrained optimization
DocType: Journal
Volume: 11
Issue: 6
ISSN: 1860-4749
Citations: 0
PageRank: 0.34
References: 0
Authors: 5
Name, Order, Citations, PageRank:
Ke Chen, 1, 750, 60.37
Weiquan Bao, 2, 0, 0.34
Huisheng Chi, 3, 211, 22.81
陈珂 (Ke Chen), 4, 2, 0.71
包威权 (Weiquan Bao), 5, 0, 0.34