Title
Reviving and Improving Recurrent Back-Propagation
Abstract
In this paper, we revisit the recurrent back-propagation (RBP) algorithm, discuss the conditions under which it applies as well as how to satisfy them in deep neural networks. We show that RBP can be unstable and propose two variants based on conjugate gradient on the normal equations (CG-RBP) and the Neumann series (Neumann-RBP). We further investigate the relationship between Neumann-RBP and back-propagation through time (BPTT) and its truncated version (TBPTT). Our Neumann-RBP has the same time complexity as TBPTT but only requires constant memory, whereas TBPTT's memory cost scales linearly with the number of truncation steps. We examine all RBP variants along with BPTT and TBPTT in three different application domains: associative memory with continuous Hopfield networks, document classification in citation networks using graph neural networks, and hyperparameter optimization for fully connected networks. All experiments demonstrate that RBPs, especially the Neumann-RBP variant, are efficient and effective for optimizing convergent recurrent neural networks.
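To make the core mechanic concrete: for a convergent recurrent update h_{t+1} = F(h_t; w) with fixed point h*, RBP computes the gradient dL/dw = (dF/dw)^T (I - J)^{-T} dL/dh*, where J = dF/dh at h*, and Neumann-RBP replaces the inverse term with a K-term Neumann series (I - J^T)^{-1} ≈ sum_{k=0}^{K} (J^T)^k evaluated via repeated vector-Jacobian products. The PyTorch sketch below is illustrative only and is not the authors' released code; the function name neumann_rbp, its argument layout, and the use of torch.autograd.grad are assumptions on our part.

import torch

def neumann_rbp(f, h_star, params, grad_h, num_steps=10):
    """Minimal sketch of a Neumann-series approximation to the RBP gradient.

    f:         callable computing one step F(h; w); it should close over `params`,
               which must have requires_grad=True
    h_star:    (approximate) fixed point of F
    grad_h:    dL/dh evaluated at h_star
    num_steps: number of Neumann terms K beyond the k = 0 term
    """
    # Re-leaf the fixed point and take one extra differentiable step through F.
    h_star = h_star.detach().requires_grad_(True)
    f_val = f(h_star)

    v = grad_h.detach()          # current term (J^T)^k dL/dh
    g = grad_h.detach().clone()  # running Neumann sum, starting at the k = 0 term
    for _ in range(num_steps):
        # Vector-Jacobian product v <- J^T v, with J = dF/dh at h_star.
        (v,) = torch.autograd.grad(f_val, h_star, grad_outputs=v, retain_graph=True)
        g = g + v

    # dL/dw = (dF/dw)^T g, again computed as a vector-Jacobian product.
    return torch.autograd.grad(f_val, params, grad_outputs=g)

Because only v and g are carried across iterations, the memory footprint stays constant in the number of Neumann steps, which matches the constant-memory property claimed for Neumann-RBP in the abstract.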
Year
2018
Venue
ICML
DocType
Conference
Volume
abs/1803.06396
Citations
3
PageRank
0.38
References
1
Authors
8
Name              Order  Citations  PageRank
Renjie Liao       1      76         5.59
Yuwen Xiong       2      187        8.44
Ethan Fetaya      3      96         12.90
Lisa Zhang        4      4          2.09
KiJung Yoon       5      5          0.80
Xaq Pitkow        6      9          6.84
Raquel Urtasun    7      6810       304.97
Richard S. Zemel  8      4958       425.68