Title: Learning Wake-Sleep Recurrent Attention Models
Abstract: Despite their success, convolutional neural networks are computationally expensive because they must examine all image locations. Stochastic attention-based models have been shown to improve computational efficiency at test time, but they remain difficult to train because of intractable posterior inference and high variance in the stochastic gradient estimates. Borrowing techniques from the literature on training deep generative models, we present the Wake-Sleep Recurrent Attention Model, a method for training stochastic attention networks that improves posterior inference and reduces the variance of the stochastic gradients. We show that our method can greatly speed up the training time for stochastic attention networks in the domains of image classification and caption generation.
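The abstract identifies high variance in stochastic gradient estimates as a core obstacle to training attention networks. The snippet below is a minimal, self-contained NumPy illustration of that general issue, showing how subtracting a baseline shrinks the variance of a REINFORCE-style score-function gradient estimator. It is not the paper's WS-RAM algorithm; the objective f, the parameter theta, and the Gaussian sampling distribution are hypothetical stand-ins chosen so the true gradient is known.

import numpy as np

rng = np.random.default_rng(0)
theta = 1.0

def f(x):
    # Hypothetical objective; E[f(x)] = (theta - 3)^2 + 1 for x ~ N(theta, 1),
    # so the true gradient w.r.t. theta is 2 * (theta - 3) = -4.
    return (x - 3.0) ** 2

def reinforce_estimates(baseline, n=100_000):
    # Single-sample score-function estimates of d/dtheta E[f(x)]:
    # (f(x) - baseline) * d/dtheta log N(x; theta, 1), with score = x - theta.
    x = rng.normal(theta, 1.0, size=n)
    score = x - theta
    return (f(x) - baseline) * score

naive = reinforce_estimates(baseline=0.0)
b = f(rng.normal(theta, 1.0, size=100_000)).mean()  # Monte Carlo estimate of E[f(x)]
centered = reinforce_estimates(baseline=b)

print("true gradient       :", 2 * (theta - 3.0))
print("naive  mean / var   :", naive.mean(), naive.var())
print("baseline mean / var :", centered.mean(), centered.var())

Both estimators are unbiased (their means agree with the true gradient of -4), but the baseline-subtracted version has markedly lower variance, which is the generic effect the paper's training techniques exploit in a more sophisticated form.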
Year: 2015
Venue: Annual Conference on Neural Information Processing Systems
Field: Wake, Convolutional neural network, Inference, Computer science, Attention model, Artificial intelligence, Generative grammar, Contextual image classification, Machine learning, Speedup
DocType:
Volume: abs/1509.06812
ISSN: 1049-5258
Journal:
Citations: 21
PageRank: 1.10
References: 15
Authors: 5
Name                    Order  Citations  PageRank
Lei Jimmy Ba            1      8887       296.55
Ruslan Salakhutdinov    2      12190      764.15
Roger B. Grosse         3      1499       107.87
Brendan J. Frey         4      3637       404.51
Salakhutdinov, Russ R.  5      21         1.10