Title
Using Multi-task and Transfer Learning to Solve Working Memory Tasks
Abstract
We propose a new architecture called Memory-Augmented Encoder-Solver (MAES) that enables transfer learning to solve complex working memory tasks adapted from cognitive psychology. It uses dual recurrent neural network controllers, one inside the encoder and one inside the solver, that interface with a shared memory module; the entire architecture is fully differentiable. We systematically study different types of encoders and demonstrate a unique advantage of multi-task learning in obtaining the best possible encoder. Through extensive experimentation we show that the trained MAES models achieve task-size generalization, i.e., given appropriately large memory modules they can handle sequential inputs 50 times longer than those seen during training. We demonstrate that MAES far outperforms existing, well-known models such as the LSTM, NTM, and DNC on the entire suite of tasks.
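The encoder-solver idea in the abstract can be illustrated with a short sketch: one recurrent controller consumes the input and writes to a shared memory, a second controller reads that memory to emit the answer, and every step is differentiable. The sketch below is a minimal, hypothetical rendering of that idea; the GRU controllers, the softmax content-addressing, and all class names and dimensions are assumptions for illustration, not the paper's actual design.

```python
import torch
import torch.nn as nn

class SharedMemory:
    """Content-addressed memory with soft (differentiable) read/write."""
    def __init__(self, slots, width):
        self.M = torch.zeros(slots, width)        # memory matrix, one row per slot

    def _weights(self, key):                      # key: (width,)
        return torch.softmax(self.M @ key, 0)     # attention weights over slots

    def read(self, key):
        return self._weights(key) @ self.M        # weighted sum of rows -> (width,)

    def write(self, key, value):
        w = self._weights(key).unsqueeze(1)       # (slots, 1)
        self.M = self.M + w * value               # additive soft write

class MAES(nn.Module):
    """Hypothetical encoder-solver pair sharing one memory module."""
    def __init__(self, in_dim, out_dim, hidden=64, slots=128, width=32):
        super().__init__()
        self.encoder = nn.GRUCell(in_dim, hidden)   # encoder controller
        self.solver  = nn.GRUCell(width, hidden)    # solver controller
        self.enc_key = nn.Linear(hidden, width)
        self.enc_val = nn.Linear(hidden, width)
        self.sol_key = nn.Linear(hidden, width)
        self.readout = nn.Linear(hidden, out_dim)
        self.slots, self.width, self.hidden = slots, width, hidden

    def forward(self, xs, out_steps):
        mem = SharedMemory(self.slots, self.width)
        h = torch.zeros(1, self.hidden)
        for x in xs:                                # encoding phase: write inputs
            h = self.encoder(x.unsqueeze(0), h)
            mem.write(self.enc_key(h)[0], self.enc_val(h)[0])
        h = torch.zeros(1, self.hidden)             # fresh solver state
        ys = []
        for _ in range(out_steps):                  # solving phase: read and emit
            r = mem.read(self.sol_key(h)[0])
            h = self.solver(r.unsqueeze(0), h)
            ys.append(self.readout(h))
        return torch.cat(ys)

# Toy usage: recall a sequence of 8-dim vectors (serial-recall flavor).
model = MAES(in_dim=8, out_dim=8)
xs = torch.randn(5, 8)
print(model(xs, out_steps=5).shape)  # torch.Size([5, 8])
```

Because the memory size is independent of the controllers' parameters, a larger memory can be supplied at test time, which is one plausible reading of how such a model could process sequences far longer than those seen during training.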
Year
2018
DOI
10.1109/ICMLA.2018.00045
Venue
2018 17th IEEE International Conference on Machine Learning and Applications (ICMLA)
Keywords
encoder-decoder, memory-augmented neural networks, working memory, transfer learning, multi-task learning
DocType
Conference
Volume
abs/1809.10847
ISBN
978-1-5386-6806-1
Citations
0
PageRank
0.34
References
0
Authors
4
Name           | Order | Citations | PageRank
T. S. Jayram   | 1     | 1373      | 75.87
Tomasz Kornuta | 2     | 55        | 11.95
Ryan L. McAvoy | 3     | 0         | 0.34
Ahmet S. Ozcan | 4     | 0         | 0.34