Title
A recurrent neural network without chaos.
Abstract
We introduce an exceptionally simple gated recurrent neural network (RNN) that achieves performance comparable to well-known gated architectures, such as LSTMs and GRUs, on the word-level language modeling task. We prove that our model has simple, predictable, and non-chaotic dynamics. This stands in stark contrast to more standard gated architectures, whose underlying dynamical systems exhibit chaotic behavior.
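The abstract does not spell out the model's update equations. As a concrete illustration only, below is a minimal Python sketch of a chaos-free gated recurrent cell in the spirit the abstract describes, assuming the commonly cited CFN-style update h_t = theta_t * tanh(h_{t-1}) + eta_t * tanh(W x_t) with sigmoid gates theta_t (forget) and eta_t (input); all names, shapes, and initializations are illustrative assumptions, not taken from this record.

# Minimal sketch of a CFN-style ("chaos-free") gated recurrent cell.
# Assumed update: h_t = theta_t * tanh(h_{t-1}) + eta_t * tanh(W x_t);
# gate names, shapes, and initializations are illustrative, not from the record.
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

class CFNCell:
    def __init__(self, input_size, hidden_size, seed=0):
        rng = np.random.default_rng(seed)
        s = 1.0 / np.sqrt(hidden_size)
        # Forget-gate (theta) and input-gate (eta) parameters, plus input projection W.
        self.U_theta = rng.uniform(-s, s, (hidden_size, hidden_size))
        self.V_theta = rng.uniform(-s, s, (hidden_size, input_size))
        self.b_theta = np.zeros(hidden_size)
        self.U_eta = rng.uniform(-s, s, (hidden_size, hidden_size))
        self.V_eta = rng.uniform(-s, s, (hidden_size, input_size))
        self.b_eta = np.zeros(hidden_size)
        self.W = rng.uniform(-s, s, (hidden_size, input_size))

    def step(self, h_prev, x):
        theta = sigmoid(self.U_theta @ h_prev + self.V_theta @ x + self.b_theta)  # forget gate
        eta = sigmoid(self.U_eta @ h_prev + self.V_eta @ x + self.b_eta)          # input gate
        # No hidden-to-hidden matrix inside the tanh: with zero input each unit
        # simply relaxes toward 0, which is what rules out chaotic dynamics.
        return theta * np.tanh(h_prev) + eta * np.tanh(self.W @ x)

# With the input removed, the state decays monotonically instead of wandering chaotically.
cell = CFNCell(input_size=4, hidden_size=8)
h = np.ones(8)
for _ in range(50):
    h = cell.step(h, np.zeros(4))
print(np.abs(h).max())  # close to 0

Because the hidden-to-hidden path passes only through a gate and a pointwise tanh of the previous state, the zero-input trajectories contract toward the origin; this simple, predictable behavior is what the abstract contrasts with the chaotic dynamics of LSTMs and GRUs.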
Year
2016
Venue
International Conference on Learning Representations
Field
Computer science, Recurrent neural network, Dynamical systems theory, Predictable, Artificial intelligence, Chaotic, Machine learning, Language model
DocType
Journal
Volume
abs/1612.06212
Citations
1
PageRank
0.35
References
0
Authors
2
Name                 Order  Citations  PageRank
Laurent, Thomas      1      74         7.43
James H. von Brecht  2      93         6.45