Title
Architectural Complexity Measures of Recurrent Neural Networks
Abstract
In this paper, we systematically analyze the connecting architectures of recurrent neural networks (RNNs). Our main contribution is twofold: first, we present a rigorous graph-theoretic framework describing the connecting architectures of RNNs in general. Second, we propose three architectural complexity measures of RNNs: (a) the recurrent depth, which captures the RNN's over-time nonlinear complexity; (b) the feedforward depth, which captures the local input-output nonlinearity (similar to the "depth" in feedforward neural networks (FNNs)); and (c) the recurrent skip coefficient, which captures how rapidly information propagates over time. We rigorously prove each measure's existence and computability. Our experimental results show that RNNs might benefit from larger recurrent depth and feedforward depth. We further demonstrate that increasing the recurrent skip coefficient offers performance boosts on long-term dependency problems.
Year
2016
Venue
ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 29 (NIPS 2016)
DocType
Conference
Volume
29
ISSN
1049-5258
Citations
21
PageRank
0.99
References
19
Authors
8
Name                     Order  Citations  PageRank
Saizheng Zhang           1      125        7.26
Wu, Yuhuai               2      158        9.68
Tong Che                 3      80         6.13
Lin, Zhouhan             4      21         0.99
Roland Memisevic         5      1116       65.87
Ruslan Salakhutdinov     6      12190      764.15
Yoshua Bengio            7      42677      3039.83
Salakhutdinov, Russ R.   8      21         0.99