Title
Learning to Remember, Forget and Ignore using Attention Control in Memory
Abstract
Typical neural networks with external memory do not effectively separate capacity for episodic and working memory, as is required for reasoning in humans. Drawing on knowledge gained from psychological studies, we designed a new model called Differentiable Working Memory (DWM) specifically to emulate human working memory. Because it exhibits the same functional characteristics as working memory, it robustly learns psychology-inspired tasks and converges faster than comparable state-of-the-art models. Moreover, the DWM model successfully generalizes to sequences two orders of magnitude longer than those used in training. Our in-depth analysis shows that the behavior of DWM is interpretable and that it learns fine-grained control over its memory, allowing it to retain, ignore, or forget information based on its relevance.
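The abstract describes attention-driven control that lets the model retain, ignore, or forget memory contents. The record does not give the DWM update equations, so the sketch below is only a minimal NumPy illustration of one way a gated, attention-weighted memory update can behave; the function names, shapes, cosine-similarity addressing, and gating scheme are assumptions for illustration and are not the authors' DWM implementation.

# Hypothetical sketch (not the authors' DWM): an external memory updated through
# soft attention, showing how a write gate and erase vector let a model
# retain, overwrite ("forget"), or leave untouched ("ignore") memory slots.
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def content_attention(memory, key, beta=5.0):
    # Soft attention over memory rows via cosine similarity with a query key.
    norms = np.linalg.norm(memory, axis=1) * np.linalg.norm(key) + 1e-8
    scores = memory @ key / norms
    return softmax(beta * scores)              # shape: (num_slots,)

def memory_update(memory, attention, erase, add, write_gate):
    # write_gate ~ 0: the input is ignored and memory is retained as-is.
    # erase ~ 1:      the attended slot is forgotten before new content is added.
    w = (write_gate * attention)[:, None]      # per-slot write strength
    memory = memory * (1.0 - w * erase[None, :])   # forget (erase) step
    memory = memory + w * add[None, :]             # remember (add) step
    return memory

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    memory = rng.normal(size=(8, 16))          # 8 slots, 16-dim content
    key    = rng.normal(size=16)               # what to look for
    erase  = np.ones(16)                       # fully forget the attended slot
    add    = rng.normal(size=16)               # new content to store
    attn = content_attention(memory, key)
    stored  = memory_update(memory, attn, erase, add, write_gate=1.0)  # remember
    ignored = memory_update(memory, attn, erase, add, write_gate=0.0)  # ignore
    print("change when writing:", np.abs(stored - memory).sum())
    print("change when ignoring:", np.abs(ignored - memory).sum())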
Year
2018
Venue
arXiv: Learning
Field
Working memory, Differentiable function, Artificial intelligence, Artificial neural network, Applying knowledge, Mathematics, Machine learning, Auxiliary memory, Attentional control
DocType
Journal
Volume
abs/1809.11087
Citations
0
PageRank
0.34
References
6
Authors
7
Name                Order  Citations  PageRank
T. S. Jayram        1      1373       75.87
Younes Bouhadjar    2      0          1.35
Ryan L. McAvoy      3      0          0.68
Tomasz Kornuta      4      55         11.95
Alexis Asseman      5      0          1.01
Kamil Rocki         6      49         8.05
Ahmet S. Ozcan      7      0          1.69