Title
Learning Multiple Timescales in Recurrent Neural Networks
Abstract
Recurrent Neural Networks (RNNs) are powerful architectures for sequence learning. Recent advances in addressing the vanishing gradient problem have led to improved results and increased research interest. Among recent proposals are architectural innovations that allow the emergence of multiple timescales during training. This paper explores a number of architectures for sequence generation and prediction tasks with long-term relationships. We compare the Simple Recurrent Network (SRN) and Long Short-Term Memory (LSTM) with the recently proposed Clockwork RNN (CWRNN), Structurally Constrained Recurrent Network (SCRN), and Recurrent Plausibility Network (RPN) with regard to their ability to learn multiple timescales. Our results show that partitioning hidden layers under distinct temporal constraints enables the learning of multiple timescales, which contributes to the understanding of the fundamental conditions that allow RNNs to self-organize towards accurate temporal abstractions.
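The abstract contrasts two timescale mechanisms named in the keywords: clocked activation (as in the Clockwork RNN) and leaky activation (as in SCRN/RPN-style units). The sketch below is a minimal NumPy illustration of both update rules under assumed toy dimensions and clock periods; it is not the authors' implementation, and the full Clockwork RNN additionally constrains the recurrent matrix to be block-upper-triangular so that slower modules feed faster ones, which is omitted here.

```python
import numpy as np

# Minimal sketch (illustrative names and sizes, not the paper's code).

def cwrnn_step(t, h_prev, x, W_h, W_x, b, periods, module_size):
    """Clocked activation: module i recomputes its state only when
    t % periods[i] == 0; inactive modules keep their previous activation."""
    pre = W_h @ h_prev + W_x @ x + b        # shared pre-activation
    h = h_prev.copy()
    for i, T in enumerate(periods):
        if t % T == 0:                      # this module's clock fires
            sl = slice(i * module_size, (i + 1) * module_size)
            h[sl] = np.tanh(pre[sl])
    return h

def leaky_step(h_prev, x, W_h, W_x, b, alpha=0.25):
    """Leaky activation: blend old state with the new activation; a small
    alpha yields slow, smooth dynamics (a long effective timescale)."""
    return (1.0 - alpha) * h_prev + alpha * np.tanh(W_h @ h_prev + W_x @ x + b)

# Assumed toy setup: 4 modules of 8 units each, 3-dimensional input.
rng = np.random.default_rng(0)
g, k, d = 4, 8, 3
H = g * k
periods = [1, 2, 4, 8]                      # exponential clock periods
W_h = rng.normal(scale=0.1, size=(H, H))
W_x = rng.normal(scale=0.1, size=(H, d))
b = np.zeros(H)

h_clock = np.zeros(H)
h_leaky = np.zeros(H)
for t in range(16):
    x = rng.normal(size=d)
    h_clock = cwrnn_step(t, h_clock, x, W_h, W_x, b, periods, k)
    h_leaky = leaky_step(h_leaky, x, W_h, W_x, b)
```

With exponential periods such as 1, 2, 4, 8, the slowest module updates only every eighth step and therefore retains context over long spans, illustrating the partitioning of hidden layers under distinct temporal constraints that the abstract credits for learning multiple timescales.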
Year
2016
DOI
10.1007/978-3-319-44778-0_16
Venue
ARTIFICIAL NEURAL NETWORKS AND MACHINE LEARNING - ICANN 2016, PT I
Keywords
Recurrent Neural Networks, Sequence learning, Multiple timescales, Leaky activation, Clocked activation
Field
Clockwork, Computer science, Recurrent neural network, Artificial intelligence, Sequence learning, Machine learning, Vanishing gradient problem
DocType
Conference
Volume
9886
ISSN
0302-9743
Citations
2
PageRank
0.38
References
6
Authors
3
Name             Order  Citations  PageRank
Tayfun Alpay     1      5          3.15
Stefan Heinrich  2      28         5.50
Stefan Wermter   3      1100       151.62