Title
A Dynamically Stabilized Recurrent Neural Network
Abstract
This work proposes a novel recurrent neural network architecture, the Dynamically Stabilized Recurrent Neural Network (DSRNN). The DSRNN includes learnable skip-connections across a specified number of time-steps, which allows a state-space representation of the network's hidden-state trajectory, and a regularization term grounded in Lyapunov stability theory is added to the loss function. The regularizer enables placement of the eigenvalues of the (linearized) transfer-function matrix at desired locations in the complex plane, thereby acting as an internal controller of the hidden-state trajectories. In this way, the DSRNN adjusts the weights of its temporal skip-connections to achieve stable recurrent hidden states, which mitigates the vanishing- and exploding-gradient problems. The efficacy of the DSRNN is demonstrated on a forecasting task using recordings of an experimental double pendulum. The results show that the DSRNN outperforms both the Long Short-Term Memory (LSTM) network and the vanilla recurrent neural network, reducing the LSTM's relative mean-squared error by up to approximately 99.64%. The DSRNN also achieves results comparable to the LSTM on a classification task involving two Lorenz oscillator systems.
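For a concrete picture of the idea described in the abstract, a minimal Python/PyTorch sketch follows. It is not the authors' implementation: the names (DSRNNCell, stability_penalty, skip_len), the use of a single learnable skip over skip_len steps, the additive linearization W_h + W_skip, and the eigenvalue-radius penalty are all illustrative assumptions about how a temporal skip-connection and a Lyapunov-style eigenvalue regularizer could be combined.

# Hypothetical sketch (not the paper's code): an RNN cell with a learnable
# skip-connection over skip_len time-steps, plus a regularizer that penalizes
# eigenvalues of a crude linearization of the recurrence lying outside a
# target radius, encouraging stable hidden-state trajectories.
import torch
import torch.nn as nn

class DSRNNCell(nn.Module):
    def __init__(self, input_size, hidden_size, skip_len=2):
        super().__init__()
        self.hidden_size = hidden_size
        self.skip_len = skip_len
        self.W_in = nn.Linear(input_size, hidden_size)
        self.W_h = nn.Linear(hidden_size, hidden_size, bias=False)    # one-step recurrence
        self.W_skip = nn.Linear(hidden_size, hidden_size, bias=False) # skip over skip_len steps

    def forward(self, x_seq):
        # x_seq: (seq_len, batch, input_size)
        h_hist = [x_seq.new_zeros(x_seq.size(1), self.hidden_size)]
        outputs = []
        for x_t in x_seq:
            h_prev = h_hist[-1]
            h_skip = h_hist[-self.skip_len] if len(h_hist) >= self.skip_len else h_hist[0]
            h_t = torch.tanh(self.W_in(x_t) + self.W_h(h_prev) + self.W_skip(h_skip))
            h_hist.append(h_t)
            outputs.append(h_t)
        return torch.stack(outputs)

def stability_penalty(cell, radius=0.9):
    # Penalize eigenvalue magnitudes of the (linearized) recurrent matrix that
    # exceed the target radius; the linearization W_h + W_skip is an assumption.
    A = cell.W_h.weight + cell.W_skip.weight
    eig_mags = torch.linalg.eigvals(A).abs()
    return torch.clamp(eig_mags - radius, min=0.0).sum()

In training, such a penalty would simply be added to the task loss, e.g. total_loss = task_loss + lam * stability_penalty(cell), with lam weighting how strongly the eigenvalues are pushed inside the chosen radius.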
Year
2022
DOI
10.1007/s11063-021-10676-7
Venue
Neural Processing Letters
Keywords
Recurrent neural networks, Long short-term memory, Lyapunov stability
DocType
Journal
Volume
54
Issue
2
ISSN
1370-4621
Citations
0
PageRank
0.34
References
10
Authors
4
Name              Order  Citations  PageRank
Samer Saab Jr.    1      1          1.72
Yiwei Fu          2      0          0.34
Ray, A.           3      832        184.32
Michael Hauser    4      0          0.34