Title: Learning to Transduce with Unbounded Memory
Abstract: Recently, strong results have been demonstrated by Deep Recurrent Neural Networks on natural language transduction problems. In this paper we explore the representational power of these models using synthetic grammars designed to exhibit phenomena similar to those found in real transduction problems such as machine translation. These experiments lead us to propose new memory-based recurrent networks that implement continuously differentiable analogues of traditional data structures such as Stacks, Queues, and DeQues. We show that these architectures exhibit superior generalisation performance to Deep RNNs and are often able to learn the underlying generating algorithms in our transduction experiments.
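The "continuously differentiable analogue" of a stack mentioned in the abstract can be made concrete with a short sketch. The NumPy code below is an illustrative forward pass only, not the authors' implementation: the class name NeuralStack, the step signature, and the loop-based pop/read are choices made here for readability. The paper expresses the same piecewise-linear pop and read updates in closed form so that gradients can flow through the push strength d, pop strength u, and value vector v emitted by an RNN controller.

```python
import numpy as np

class NeuralStack:
    """Forward-pass sketch of a continuous stack (names/structure assumed here).

    V holds pushed value vectors; s[i] is the fractional strength with
    which V[i] remains on the stack. In the paper, d, u, and v are
    produced by an RNN controller at every timestep.
    """

    def __init__(self, dim):
        self.dim = dim
        self.V = []   # value vectors, bottom to top
        self.s = []   # strengths in [0, 1], parallel to V

    def step(self, v, d, u):
        """One timestep: pop strength u, push v with strength d, then read."""
        # Pop: remove up to u units of strength, starting from the top.
        # (Loop form of the paper's max(0, u - sum(...)) update.)
        remaining_pop = u
        for i in reversed(range(len(self.s))):
            removed = min(self.s[i], remaining_pop)
            self.s[i] -= removed
            remaining_pop -= removed
        # Push: append the new value with strength d.
        self.V.append(np.asarray(v, dtype=float))
        self.s.append(float(d))
        # Read: blend values top-down until one unit of strength is used,
        # so the result is a soft version of "the top of the stack".
        r = np.zeros(self.dim)
        budget = 1.0
        for i in reversed(range(len(self.V))):
            w = min(self.s[i], budget)
            r += w * self.V[i]
            budget -= w
        return r

# Usage: with d = u fixed at 0 or 1 the stack behaves discretely.
stack = NeuralStack(dim=2)
print(stack.step([1.0, 0.0], d=1.0, u=0.0))  # push a -> reads a
print(stack.step([0.0, 1.0], d=1.0, u=0.0))  # push b -> reads b
print(stack.step([0.0, 0.0], d=0.0, u=1.0))  # pop b  -> reads a again
```

Because every operation above is built from min, max, addition, and scalar-vector products, the read vector is a piecewise-linear (hence almost-everywhere differentiable) function of d, u, and the stored values, which is what lets the controller learn when to push and pop by backpropagation.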
Year: 2015
Venue: Annual Conference on Neural Information Processing Systems
Field: Rule-based machine translation, Data structure, Computer science, Generalization, Machine translation, Queue, Recurrent neural network, Natural language, Artificial intelligence, Machine learning
DocType:
Volume: abs/1506.02516
ISSN: 1049-5258
Journal:
Citations: 69
PageRank: 3.05
References: 15
Authors: 4

Name                 Order  Citations  PageRank
Edward Grefenstette  1      1743       80.65
Karl Moritz Hermann  2      1131       47.50
Mustafa Suleyman     3      628        24.43
Phil Blunsom         4      3130       152.18