Title
Mitigation of catastrophic forgetting in recurrent neural networks using a Fixed Expansion Layer
Abstract
Catastrophic forgetting (or catastrophic interference) in supervised learning systems is the drastic loss of previously stored information caused by the learning of new information. While substantial work has been published on addressing catastrophic forgetting in memoryless supervised learning systems (e.g., feedforward neural networks), the problem has received limited attention in the context of dynamic systems, particularly recurrent neural networks (RNNs). In this paper, we introduce a solution for mitigating catastrophic forgetting in RNNs based on enhancing the Fixed Expansion Layer (FEL) neural network, which exploits sparse coding of hidden neuron activations. Simulation results on several non-stationary data sets clearly demonstrate the effectiveness of the proposed architecture.
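The record contains no code, so as a rough, non-authoritative illustration of the mechanism the abstract describes, below is a minimal Python/NumPy sketch of a fixed expansion layer attached to an Elman-style recurrent network. It assumes a k-winner-take-all sparse code over a fixed random projection, and it additionally keeps the recurrent weights fixed and random (a reservoir-style simplification); all sizes, the learning rule, and the toy input stream are invented for illustration and are not the authors' formulation.

    import numpy as np

    rng = np.random.default_rng(0)

    # Illustrative dimensions (not taken from the paper)
    n_in, n_hidden, n_expand, n_out = 4, 16, 256, 2
    k_active = 16  # expansion units allowed to fire per step (sparsity level)

    W_in  = rng.normal(0.0, 0.3, (n_hidden, n_in))      # input -> hidden
    W_rec = rng.normal(0.0, 0.3, (n_hidden, n_hidden))  # recurrent hidden -> hidden
    W_fel = rng.normal(0.0, 1.0, (n_expand, n_hidden))  # hidden -> expansion, FIXED (never trained)
    W_out = np.zeros((n_out, n_expand))                 # expansion -> output, trainable

    def fel_code(h):
        """Sparse-code the hidden state: only the k most strongly
        driven expansion units switch on (k-winner-take-all)."""
        drive = W_fel @ h
        code = np.zeros(n_expand)
        code[np.argsort(drive)[-k_active:]] = 1.0
        return code

    h = np.zeros(n_hidden)
    for t in range(200):                        # toy online-learning loop
        x = rng.normal(size=n_in)               # stand-in for a non-stationary stream
        target = np.array([x[0] > 0.0, x[0] <= 0.0], dtype=float)
        h = np.tanh(W_in @ x + W_rec @ h)       # Elman-style recurrence
        c = fel_code(h)                         # sparse expansion of the hidden state
        y = W_out @ c
        # Delta rule on the readout only: because sparse codes for different
        # patterns share few active units, learning a new pattern overwrites
        # few of the weights that encode old ones.
        W_out += 0.05 * np.outer(target - y, c)

The point of the sketch is the division of labour: the expansion weights never change, and only the readout over a sparse, mostly non-overlapping code is adapted online, which is what limits interference between old and new patterns.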
Year
2013
DOI
10.1109/IJCNN.2013.6707047
Venue
IJCNN
Keywords
feedforward neural nets, learning (artificial intelligence), recurrent neural nets, FEL neural network, RNN, catastrophic forgetting mitigation, catastrophic interference, dynamic systems, feedforward neural networks, fixed expansion layer, hidden neuron activations, memoryless supervised learning systems, nonstationary data sets, recurrent neural networks, sparse coding
Field
Forgetting, Feedforward neural network, Computer science, Recurrent neural network, Time delay neural network, Types of artificial neural networks, Artificial intelligence, Deep learning, Artificial neural network, Catastrophic interference, Machine learning
DocType
Conference
ISSN
2161-4393
Citations
1
PageRank
0.35
References
0
Authors
2
Name          Order  Citations  PageRank
Robert Coop   1      80         6.66
Itamar Arel   2      1601       8.56