| Abstract |
|---|
| Recurrent neural networks (RNNs) have been widely adopted in research areas concerned with sequential data, such as text, audio, and video. However, RNNs consisting of sigma cells or tanh cells are unable to learn the relevant information of input data when the input gap is large. By introducing gate functions into the cell structure, the long short-term memory (LSTM) could handle the problem of l... |
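The gating mechanism the abstract refers to can be sketched as a single LSTM step: sigmoid gates decide what the cell state keeps, forgets, and emits, which is what lets the cell bridge large input gaps. This is a minimal NumPy illustration of the standard LSTM equations, not code from the paper; the function name, weight layout, and toy dimensions are assumptions for the sketch.

```python
import numpy as np

def sigmoid(x):
    # logistic function used by the input, forget, and output gates
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h_prev, c_prev, W, b):
    """One LSTM step. W has shape (4*H, D+H), b has shape (4*H,);
    rows are stacked in gate order: input i, forget f, output o, candidate g."""
    H = h_prev.shape[0]
    z = W @ np.concatenate([x, h_prev]) + b
    i = sigmoid(z[0:H])        # input gate: how much new info to write
    f = sigmoid(z[H:2*H])      # forget gate: how much old cell state to keep
    o = sigmoid(z[2*H:3*H])    # output gate: how much cell state to expose
    g = np.tanh(z[3*H:4*H])    # candidate cell content
    c = f * c_prev + i * g     # cell state: the long-range memory path
    h = o * np.tanh(c)         # hidden state passed to the next step
    return h, c

# Toy usage with hypothetical sizes: D=3 input features, H=2 hidden units.
rng = np.random.default_rng(0)
D, H = 3, 2
W = rng.standard_normal((4 * H, D + H)) * 0.1
b = np.zeros(4 * H)
h, c = np.zeros(H), np.zeros(H)
for t in range(5):
    h, c = lstm_step(rng.standard_normal(D), h, c, W, b)
print(h.shape, c.shape)  # (2,) (2,)
```

Because `h` is an output gate times a `tanh` of the cell state, every hidden activation stays in (-1, 1), while `c` itself is additive across steps and can grow, which is the design choice that preserves gradients over long sequences.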
| Year | DOI | Venue |
|---|---|---|
| 2019 | 10.1162/neco_a_01199 | Neural Computation |

| Field | DocType | Volume |
|---|---|---|
| Sequential data, Network architecture, Recurrent neural network, Artificial intelligence, Sigma, Machine learning, Mathematics | Journal | 31 |

| Issue | ISSN | Citations |
|---|---|---|
| 7 | 0899-7667 | 37 |

| PageRank | References | Authors |
|---|---|---|
| 2.10 | 0 | 4 |
| Name | Order | Citations | PageRank |
|---|---|---|---|
| Yong Yu | 1 | 39 | 3.86 |
| Xiao-Sheng Si | 2 | 623 | 46.17 |
| Chang-Hua Hu | 3 | 483 | 31.18 |
| Jian-Xun Zhang | 4 | 49 | 6.42 |