Title
Relational recurrent neural networks
Abstract
Memory-based neural networks model temporal data by leveraging an ability to remember information for long periods. It is unclear, however, whether they also have an ability to perform complex relational reasoning with the information they remember. Here, we first confirm our intuitions that standard memory architectures may struggle at tasks that heavily involve an understanding of the ways in which entities are connected - i.e., tasks involving relational reasoning. We then improve upon these deficits by using a new memory module - a Relational Memory Core (RMC) - which employs multi-head dot product attention to allow memories to interact. Finally, we test the RMC on a suite of tasks that may profit from more capable relational reasoning across sequential information, and show large gains in RL domains (e.g. Mini PacMan), program evaluation, and language modeling, achieving state-of-the-art results on the WikiText-103, Project Gutenberg, and GigaWord datasets.
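The mechanism named in the abstract is multi-head dot-product attention applied across memory slots so that memories can interact. The following is a minimal NumPy sketch of one such memory-update step, not the authors' implementation: the paper's RMC additionally applies an MLP, layer normalization, and LSTM-style gating, all omitted here. The function names, the weight matrices Wq/Wk/Wv, and all shapes are illustrative assumptions.

```python
# Minimal sketch of a relational memory update via multi-head
# dot-product attention (illustrative only; the published RMC also
# uses an MLP, layer normalization, and gating on top of this step).
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def relational_memory_step(memory, inputs, Wq, Wk, Wv, num_heads):
    """memory: (slots, d); inputs: (n_in, d); returns updated (slots, d).

    Memory slots attend over the concatenation of memory and new input,
    which is how the RMC lets stored memories interact with each other
    and with incoming information.
    """
    mem_and_input = np.concatenate([memory, inputs], axis=0)
    d = memory.shape[1]
    dh = d // num_heads                      # per-head dimension
    q = memory @ Wq                          # queries come from memory only
    k = mem_and_input @ Wk                   # keys/values cover memory + input
    v = mem_and_input @ Wv
    heads = []
    for h in range(num_heads):
        sl = slice(h * dh, (h + 1) * dh)
        attn = softmax(q[:, sl] @ k[:, sl].T / np.sqrt(dh))  # scaled dot product
        heads.append(attn @ v[:, sl])
    # Residual connection; gating/MLP from the paper omitted for brevity.
    return memory + np.concatenate(heads, axis=1)

# Toy usage with assumed sizes.
rng = np.random.default_rng(0)
d, slots, n_in, n_heads = 8, 4, 2, 2
memory = rng.standard_normal((slots, d))
inputs = rng.standard_normal((n_in, d))
Wq, Wk, Wv = (rng.standard_normal((d, d)) * 0.1 for _ in range(3))
print(relational_memory_step(memory, inputs, Wq, Wk, Wv, n_heads).shape)  # (4, 8)
```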
Year
2018
Venue
Advances in Neural Information Processing Systems 31 (NIPS 2018)
Keywords
neural networks, language modeling, recurrent neural networks, Project Gutenberg, program evaluation, memory module, relational recurrent neural networks
DocType
Conference
Volume
31
ISSN
1049-5258
Citations
9
PageRank
0.46
References
20
Authors
10
Name                   Order  Citations  PageRank
Adam Santoro           1      438        20.37
Ryan Faulkner          2      108        4.48
David Raposo           3      166        5.24
Jack Rae               4      75         8.77
Mike Chrzanowski       5      309        12.21
Theophane Weber        6      159        16.79
Daan Wierstra          7      5412       255.92
Oriol Vinyals          8      9419       418.45
Razvan Pascanu         9      2596       199.21
Timothy P. Lillicrap   10     4377       170.65