Title
Recurrent Neural Networks Training With Stable Bounding Ellipsoid Algorithm
Abstract
Bounding ellipsoid (BE) algorithms offer an attractive alternative to traditional training algorithms for neural networks, for example, backpropagation and least squares methods. The benefits include high computational efficiency and fast convergence speed. In this paper, we propose an ellipsoid propagation algorithm to train the weights of recurrent neural networks for nonlinear systems identification. Both hidden layers and output layers can be updated. The stability of the BE algorithm is proven.
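The abstract describes the approach only at a high level. Below is a minimal Python sketch of a set-membership (bounding-ellipsoid-style) weight update applied to the output layer of a simple recurrent network on a toy nonlinear identification task. The Elman-style recurrence with fixed hidden weights, the noise bound gamma, the dead-zone rule, and all variable and function names are illustrative assumptions, not the paper's actual equations.

# Illustrative sketch only: a set-membership / bounding-ellipsoid style update
# for the output-layer weights of a simple recurrent network used for
# nonlinear system identification. Not the paper's exact algorithm.
import numpy as np

rng = np.random.default_rng(0)

n_in, n_hidden = 2, 10
W_in = rng.normal(scale=0.5, size=(n_hidden, n_in))       # fixed input weights (assumption)
W_rec = rng.normal(scale=0.1, size=(n_hidden, n_hidden))  # fixed recurrent weights (assumption)
theta = np.zeros(n_hidden)                                # trainable output-layer weights

P = 100.0 * np.eye(n_hidden)  # shape matrix of the ellipsoid around the weight estimate
gamma = 0.05                  # assumed bound on the identification error

def plant(u, x_prev):
    """Toy nonlinear plant to identify (illustrative only)."""
    return 0.6 * np.sin(x_prev) + 0.3 * u + 0.1 * u * x_prev

h = np.zeros(n_hidden)
y_prev = 0.0
for k in range(500):
    u = np.sin(0.1 * k)
    y = plant(u, y_prev)

    # recurrent hidden state (hidden layer kept fixed in this sketch)
    h = np.tanh(W_in @ np.array([u, y_prev]) + W_rec @ h)

    y_hat = theta @ h
    err = y - y_hat

    # Bounded-error (ellipsoid-style) update: correct the weights only when the
    # prediction error exceeds the assumed bound, keeping the estimate inside
    # an ellipsoid consistent with the bounded-error model.
    if abs(err) > gamma:
        Ph = P @ h
        denom = 1.0 + h @ Ph
        theta = theta + (Ph / denom) * err
        P = P - np.outer(Ph, Ph) / denom

    y_prev = y

print("final output-layer weights:", np.round(theta, 3))

The dead-zone test is what distinguishes this bounded-error update from plain recursive least squares: the weights are corrected only when the prediction error falls outside the assumed error bound, which is the basic mechanism behind set-membership and bounding-ellipsoid identification schemes.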
Year
2009
DOI
10.1109/TNN.2009.2015079
Venue
IEEE Transactions on Neural Networks
Keywords
Lyapunov-like technique, Lyapunov methods, neural networks, feedforward neural networks, recurrent neural networks, recurrent neural network training, ellipsoid algorithm, ellipsoid propagation algorithm, stable bounding ellipsoid algorithm, bounding ellipsoid (BE), ellipsoids, identification, nonlinear systems, nonlinear systems identification, hidden layer, learning (artificial intelligence), backpropagation, least squares method, traditional training algorithms, high computational efficiency, convergence, convergence speed, function approximation, feedback, computer simulation, uncertainty
Field
Feedforward neural network, Ellipsoid, Pattern recognition, Computer science, Recurrent neural network, Artificial intelligence, System identification, Backpropagation, Artificial neural network, Ellipsoid method, Machine learning
DocType
Journal
Volume
20
Issue
6
ISSN
1941-0093
Citations
16
PageRank
0.81
References
18
Authors
2
Name                    Order  Citations  PageRank
Wen Yu                  1      283        22.70
José De Jesús Rubio     2      574        36.29