Title
Generative Temporal Models with Memory
Abstract
We consider the general problem of modeling temporal data with long-range dependencies, wherein new observations are fully or partially predictable based on temporally-distant, past observations. A sufficiently powerful temporal model should separate predictable elements of the sequence from unpredictable elements, express uncertainty about those unpredictable elements, and rapidly identify novel elements that may help to predict the future. To create such models, we introduce Generative Temporal Models augmented with external memory systems. They are developed within the variational inference framework, which provides both a practical training methodology and methods to gain insight into the models' operation. We show, on a range of problems with sparse, long-term temporal dependencies, that these models store information from early in a sequence, and reuse this stored information efficiently. This allows them to perform substantially better than existing models based on well-known recurrent neural networks, like LSTMs.
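The sketch below illustrates the general recipe the abstract describes: a latent-variable temporal model that reads from an external memory of past latent codes and is trained by maximizing a per-step variational lower bound (ELBO). It is a minimal sketch, not the paper's architecture; the MemoryGTM name, layer sizes, dot-product attention read, round-robin slot write, and Gaussian latent/observation distributions are all illustrative assumptions.

import torch
import torch.nn as nn
import torch.nn.functional as F
from torch.distributions import Normal, kl_divergence

class MemoryGTM(nn.Module):  # hypothetical name, not from the paper
    def __init__(self, x_dim, z_dim, h_dim, mem_slots):
        super().__init__()
        self.rnn = nn.GRUCell(x_dim + z_dim, h_dim)      # deterministic state h_t
        self.query = nn.Linear(h_dim, z_dim)             # produces a memory read key
        self.prior_net = nn.Linear(h_dim + z_dim, 2 * z_dim)         # p(z_t | h_t, read)
        self.post_net = nn.Linear(h_dim + z_dim + x_dim, 2 * z_dim)  # q(z_t | h_t, read, x_t)
        self.decoder = nn.Linear(z_dim + h_dim, x_dim)   # mean of p(x_t | z_t, h_t)
        self.z_dim, self.h_dim, self.mem_slots = z_dim, h_dim, mem_slots

    def elbo(self, x):
        """x: (T, B, x_dim). Returns the mean ELBO over the batch."""
        T, B, _ = x.shape
        h = x.new_zeros(B, self.h_dim)
        memory = x.new_zeros(B, self.mem_slots, self.z_dim)
        total = x.new_zeros(B)
        for t in range(T):
            # Content-based read: softmax attention over stored latent codes.
            key = self.query(h)                                         # (B, z_dim)
            attn = torch.softmax((memory @ key.unsqueeze(-1)).squeeze(-1), dim=-1)
            read = (attn.unsqueeze(-1) * memory).sum(dim=1)             # (B, z_dim)
            # Prior and approximate posterior over the latent z_t.
            pm, ps = self.prior_net(torch.cat([h, read], -1)).chunk(2, -1)
            qm, qs = self.post_net(torch.cat([h, read, x[t]], -1)).chunk(2, -1)
            prior = Normal(pm, F.softplus(ps) + 1e-4)
            posterior = Normal(qm, F.softplus(qs) + 1e-4)
            z = posterior.rsample()                                     # reparameterized sample
            # Per-step ELBO term: reconstruction log-likelihood minus KL to the prior.
            recon = Normal(self.decoder(torch.cat([z, h], -1)), 1.0).log_prob(x[t]).sum(-1)
            total = total + recon - kl_divergence(posterior, prior).sum(-1)
            # Write z_t into a memory slot (round-robin) and update the state.
            memory = memory.clone()
            memory[:, t % self.mem_slots] = z
            h = self.rnn(torch.cat([x[t], z], -1), h)
        return total.mean()

# Usage: minimize the negative ELBO on a toy batch of sequences.
model = MemoryGTM(x_dim=8, z_dim=4, h_dim=16, mem_slots=10)
x = torch.randn(20, 2, 8)          # T=20 steps, batch of 2
loss = -model.elbo(x)
loss.backward()

Storing past latent codes in an addressable memory, rather than only in the recurrent state, is what lets information from early in a sequence be retrieved directly at a much later step, which is the behavior the abstract claims for these models.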
Year
2017
Venue
arXiv: Learning
Field
Inference, Reuse, Computer science, Recurrent neural network, Temporal models, Temporal database, Artificial intelligence, Generative grammar, Machine learning, Auxiliary memory
DocType
Journal
Volume
abs/1702.04649
Citations
10
PageRank
0.66
References
26
Authors
8
Name                    Order  Citations  PageRank
Mevlana Gemici          1      23         1.87
Chia-Chun Hung          2      40         3.35
Adam Santoro            3      438        20.37
Greg Wayne              4      592        31.86
Shakir Mohamed          5      1538       71.62
Danilo Jimenez Rezende  6      1567       81.67
Amos David              7      61         5.40
Timothy P. Lillicrap    8      4377       170.65