Title
Recurrent Neural Networks With External Addressable Long-Term and Working Memory for Learning Long-Term Dependences.
Abstract
Learning long-term dependences (LTDs) with recurrent neural networks (RNNs) is challenging due to their limited internal memories. In this paper, we propose a new external memory architecture for RNNs, called an external addressable long-term and working memory (EALWM)-augmented RNN. This architecture has two distinct advantages over existing neural external memory architectures: the external memory is divided into two parts, a long-term memory and a working memory, both of which are addressable; and, under necessary assumptions, the architecture can learn LTDs without suffering from vanishing gradients. Experimental results on algorithm learning, language modeling, and question answering demonstrate that the proposed neural memory architecture is promising for practical applications.
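The abstract describes the architecture only in prose. Purely for orientation, below is a minimal, hypothetical sketch of the general pattern it names: an RNN cell that reads from and writes to two addressable external memories (a long-term store and a working store) via content-based softmax addressing. This is not the paper's actual EALWM formulation; all class, function, and variable names here are assumptions.

```python
# Illustrative sketch only: the EALWM paper's exact equations are not
# reproduced here. It shows an RNN cell augmented with two addressable
# external memories, in the spirit of memory-augmented RNNs.
import numpy as np

rng = np.random.default_rng(0)

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

class MemoryAugmentedRNN:
    """Hypothetical memory-augmented RNN cell (names are assumptions)."""
    def __init__(self, input_dim, hidden_dim, slots, slot_dim):
        d = hidden_dim
        self.Wx = rng.normal(0, 0.1, (d, input_dim))
        self.Wh = rng.normal(0, 0.1, (d, d))
        self.Wr = rng.normal(0, 0.1, (d, 2 * slot_dim))  # mixes reads from both memories
        self.b = np.zeros(d)
        self.Wk = rng.normal(0, 0.1, (slot_dim, d))      # key projection for addressing
        self.Wv = rng.normal(0, 0.1, (slot_dim, d))      # write-value projection
        # Two external memories: a long-term store and a working store.
        self.M_long = np.zeros((slots, slot_dim))
        self.M_work = np.zeros((slots, slot_dim))

    def _read(self, M, key):
        w = softmax(M @ key)   # content-based addressing over memory slots
        return w @ M, w        # read vector and its address weights

    def step(self, x, h):
        key = self.Wk @ h
        r_long, _ = self._read(self.M_long, key)
        r_work, w_work = self._read(self.M_work, key)
        r = np.concatenate([r_long, r_work])
        h_new = np.tanh(self.Wx @ x + self.Wh @ h + self.Wr @ r + self.b)
        # Write only to working memory at the addressed slots; leaving the
        # long-term store untouched per step is an assumption, standing in
        # for whatever slower consolidation rule the paper actually uses.
        v = self.Wv @ h_new
        self.M_work += np.outer(w_work, v)
        return h_new

cell = MemoryAugmentedRNN(input_dim=8, hidden_dim=16, slots=10, slot_dim=12)
h = np.zeros(16)
for t in range(5):
    h = cell.step(rng.normal(size=8), h)
print(h.shape)  # (16,)
```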
Year
2020
DOI
10.1109/TNNLS.2019.2910302
Venue
IEEE Transactions on Neural Networks and Learning Systems
Keywords
Recurrent neural networks, Logic gates, Random access memory, Memory architecture, Microprocessors
Field
Architecture, Question answering, Computer science, Working memory, Recurrent neural network, Artificial intelligence, Language model, Memory architecture, Machine learning, Auxiliary memory
DocType
Journal
Volume
31
Issue
3
ISSN
2162-237X
Citations
3
PageRank
0.38
References
8
Authors
6
Name | Order | Citations | PageRank
Zhibin Quan | 1 | 16 | 2.65
Weili Zeng | 2 | 3 | 1.05
Xuelian Li | 3 | 3 | 0.38
Yandong Liu | 4 | 3 | 0.38
Yunxiu Yu | 5 | 3 | 0.72
Wankou Yang | 6 | 199 | 26.33