Title
Do RNN and LSTM have Long Memory?
Abstract
The LSTM network was proposed to overcome the difficulty in learning long-term dependencies, and has enabled significant advancements in applications. With its success and drawbacks in mind, this paper raises the question: do RNN and LSTM have long memory? We answer it partially by proving that RNN and LSTM do not have long memory from a statistical perspective. A new definition for long memory networks is further introduced, which requires the gradient to decay hyperbolically. To verify our theory, we convert RNN and LSTM into long memory networks by making a minimal modification, and their superiority is illustrated in modeling the long-term dependencies of various datasets.
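The distinction the abstract draws can be illustrated numerically: standard RNN/LSTM gradients shrink exponentially with the lag, whereas the proposed long-memory criterion requires only hyperbolic (polynomial) decay. This is a minimal sketch, not the paper's method; the rate parameters `rho` and `d` are arbitrary example values.

```python
def exponential_decay(t, rho=0.9):
    # Short memory: |d h_t / d h_0| shrinks like rho**t for a
    # contractive recurrent network (rho is an assumed example rate).
    return rho ** t

def hyperbolic_decay(t, d=0.4):
    # Long-memory criterion: decay like t**(-d); the exponent d
    # is an assumed example value, not taken from the paper.
    return (1.0 + t) ** (-d)

# At large lags the hyperbolic curve dominates the exponential one,
# so gradient information from distant time steps is retained.
for t in (10, 100, 1000):
    print(t, exponential_decay(t), hyperbolic_decay(t))
```

Even at a lag of only 100 steps, the exponential term is already orders of magnitude smaller than the hyperbolic one, which is the sense in which standard RNN and LSTM lack long memory.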
Year
2020
Venue
ICML
DocType
Conference
Citations
0
PageRank
0.34
References
0
Authors
7
Name | Order | Citations | PageRank
Jingyu Zhao | 1 | 0 | 0.34
Feiqing Huang | 2 | 0 | 0.34
Jia Lv | 3 | 0 | 0.68
Yanjie Duan | 4 | 0 | 0.34
Zhen Qin | 5 | 0 | 0.34
Guodong Li | 6 | 232 | 31.77
Guangjian Tian | 7 | 14 | 4.56