Title
A Review of Recurrent Neural Networks: LSTM Cells and Network Architectures.
Abstract
Recurrent neural networks (RNNs) have been widely adopted in research areas concerned with sequential data, such as text, audio, and video. However, RNNs consisting of sigma cells or tanh cells are unable to learn the relevant information of input data when the input gap is large. By introducing gate functions into the cell structure, the long short-term memory (LSTM) could handle the problem of l...
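The abstract's claim that gate functions in the cell structure let the LSTM retain information over long input gaps can be made concrete with a minimal sketch of a single LSTM step. This is an illustrative NumPy implementation of the standard gated-cell formulation, not code from the reviewed paper; the function and parameter names (lstm_cell, W, U, b) are hypothetical.

    import numpy as np

    def sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    def lstm_cell(x_t, h_prev, c_prev, W, U, b):
        """One step of a standard LSTM cell.

        W, U, b are dicts keyed by gate name ('i', 'f', 'o', 'g') holding
        input-to-hidden weights, hidden-to-hidden weights, and biases.
        """
        i = sigmoid(W['i'] @ x_t + U['i'] @ h_prev + b['i'])   # input gate
        f = sigmoid(W['f'] @ x_t + U['f'] @ h_prev + b['f'])   # forget gate
        o = sigmoid(W['o'] @ x_t + U['o'] @ h_prev + b['o'])   # output gate
        g = np.tanh(W['g'] @ x_t + U['g'] @ h_prev + b['g'])   # candidate cell state
        c_t = f * c_prev + i * g                                # additive cell update eases long-range gradient flow
        h_t = o * np.tanh(c_t)                                  # hidden state exposed to the next layer
        return h_t, c_t

    # Illustrative usage with random parameters (hidden size 4, input size 3)
    rng = np.random.default_rng(0)
    W = {k: rng.standard_normal((4, 3)) for k in 'ifog'}
    U = {k: rng.standard_normal((4, 4)) for k in 'ifog'}
    b = {k: np.zeros(4) for k in 'ifog'}
    h, c = lstm_cell(rng.standard_normal(3), np.zeros(4), np.zeros(4), W, U, b)

The forget gate f scales the previous cell state c_prev, so information can pass through many steps largely unchanged, which is the mechanism the abstract contrasts with plain sigma or tanh cells.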
Year
2019
DOI
10.1162/neco_a_01199
Venue
Neural Computation
Field
Sequential data, Network architecture, Recurrent neural network, Artificial intelligence, Sigma, Machine learning, Mathematics
DocType
Journal
Volume
31
Issue
7
ISSN
0899-7667
Citations
37
PageRank
2.10
References
0
Authors
4
Name              Order  Citations  PageRank
Yong Yu           1      39         3.86
Xiao-Sheng Si     2      623        46.17
Chang-Hua Hu      3      483        31.18
Jian-Xun Zhang    4      49         6.42