Title: Continual Learning Long Short Term Memory
Abstract: Catastrophic forgetting in neural networks refers to the degradation of a deep learning model's performance on previous tasks while it learns new tasks. To address this problem, we propose in this paper a novel Continual Learning Long Short Term Memory (CL-LSTM) cell for Recurrent Neural Networks (RNNs). CL-LSTM considers not only the state of each individual task's output gates but also the correlation of gate states across tasks, so that deep learning models can incrementally learn new tasks without catastrophically forgetting previously learned ones. Experimental results demonstrate significant improvements of CL-LSTM over state-of-the-art approaches on spoken language understanding (SLU) tasks.
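The abstract describes a cell that keeps per-task output-gate states and exploits correlations between the gate states of different tasks. The snippet below is only a minimal sketch of that general idea in PyTorch, not the paper's actual formulation or code; the class and identifiers (TaskAwareLSTMCell, gate_mix, task_id) are illustrative assumptions.

```python
# Illustrative sketch only (assumed names, not the authors' implementation):
# an LSTM cell with one output gate per task, where the current task's gate
# is mixed with previously learned tasks' gates via learned correlation weights.
import torch
import torch.nn as nn


class TaskAwareLSTMCell(nn.Module):
    def __init__(self, input_size: int, hidden_size: int, num_tasks: int):
        super().__init__()
        self.hidden_size = hidden_size
        # Shared input, forget, and cell-candidate gates.
        self.gates = nn.Linear(input_size + hidden_size, 3 * hidden_size)
        # One output gate per task, so earlier tasks keep their own gating.
        self.output_gates = nn.ModuleList(
            [nn.Linear(input_size + hidden_size, hidden_size) for _ in range(num_tasks)]
        )
        # Learnable mixing weights standing in for the cross-task correlation term.
        self.gate_mix = nn.Parameter(torch.eye(num_tasks))

    def forward(self, x, h, c, task_id: int):
        z = torch.cat([x, h], dim=-1)
        i, f, g = self.gates(z).chunk(3, dim=-1)
        i, f, g = torch.sigmoid(i), torch.sigmoid(f), torch.tanh(g)
        c_new = f * c + i * g
        # Compute every task's output gate, then mix them for the current task.
        all_gates = torch.stack([torch.sigmoid(og(z)) for og in self.output_gates], dim=0)
        weights = torch.softmax(self.gate_mix[task_id], dim=0).view(-1, 1, 1)
        o = (weights * all_gates).sum(dim=0)
        h_new = o * torch.tanh(c_new)
        return h_new, c_new


# Usage: one timestep of task 1 out of 3, batch of 4 sequences.
cell = TaskAwareLSTMCell(input_size=16, hidden_size=32, num_tasks=3)
x = torch.randn(4, 16)
h = torch.zeros(4, 32)
c = torch.zeros(4, 32)
h, c = cell(x, h, c, task_id=1)
```

The mixing matrix here is just one plausible way to model correlations between tasks' output-gate states; the paper should be consulted for the actual mechanism and training procedure.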
Year: 2020
DOI: 10.18653/V1/2020.FINDINGS-EMNLP.164
Venue: EMNLP
DocType: Conference
Volume: 2020.findings-emnlp
Citations: 0
PageRank: 0.34
References: 0
Authors: 7
Name                  Order  Citations  PageRank
Xin Guo               1      31         15.25
Yu Tian               2      0          0.34
Qinghan Xue           3      0          0.68
Panos Lampropoulos    4      0          0.34
Steven Eliuk          5      0          0.68
Kenneth E. Barner     6      812        70.19
Xiaolong Wang         7      0          0.68