Title: Fast Generation for Convolutional Autoregressive Models
Abstract: Convolutional autoregressive models have recently demonstrated state-of-the-art performance on a number of generation tasks. While fast, parallel training methods have been crucial for their success, generation is typically implemented in a naive fashion in which redundant computations are repeated. This results in slow generation, making such models infeasible for production environments. In this work, we describe a method to speed up generation in convolutional autoregressive models. The key idea is to cache hidden states to avoid redundant computation. We apply our fast generation method to the Wavenet and PixelCNN++ models and achieve up to $21\times$ and $183\times$ speedups, respectively.
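The caching idea summarized in the abstract (keep per-layer hidden states around so each new sample reuses past activations instead of recomputing the whole receptive field) can be illustrated with a minimal sketch. This is not the authors' implementation; the queue-based layer, the kernel size of 2, the toy weights, and every name below are illustrative assumptions.

```python
from collections import deque
import numpy as np


class CachedDilatedLayer:
    """One dilated causal conv layer (kernel size 2) with a rolling cache of past inputs."""

    def __init__(self, dilation, channels, rng):
        self.w_past = rng.standard_normal((channels, channels)) * 0.1
        self.w_now = rng.standard_normal((channels, channels)) * 0.1
        # FIFO long enough to reach back exactly `dilation` steps; its oldest entry
        # is the only past input a kernel-size-2 dilated convolution needs.
        self.cache = deque([np.zeros(channels)] * dilation, maxlen=dilation)

    def step(self, x):
        past = self.cache[0]          # input from `dilation` steps ago
        self.cache.append(x)          # store the current input for future steps
        return np.tanh(past @ self.w_past + x @ self.w_now)


def generate(layers, steps, channels, rng):
    x = np.zeros(channels)
    outputs = []
    for _ in range(steps):
        h = x
        for layer in layers:          # one pass per layer per sample: O(#layers)
            h = layer.step(h)         # instead of O(#layers * receptive_field)
        # Stand-in for sampling the next value from the model's output distribution.
        x = np.tanh(h) + rng.standard_normal(channels) * 0.01
        outputs.append(x)
    return outputs


rng = np.random.default_rng(0)
layers = [CachedDilatedLayer(d, channels=8, rng=rng) for d in (1, 2, 4, 8)]
samples = generate(layers, steps=16, channels=8, rng=rng)
print(len(samples), samples[0].shape)   # 16 (8,)
```

In this sketch the deque of length equal to the dilation plays the role of the hidden-state cache: reading its oldest entry retrieves the activation from exactly `dilation` steps back, so generating one sample costs one pass through the layer stack rather than a full convolution over the receptive field.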
Year: 2017
Venue: ICLR
Field: Autoregressive model, Cache, Computer science, Artificial intelligence, Machine learning, Speedup, Computation
DocType:
Volume: abs/1704.06001
Citations: 9
Journal:
PageRank: 0.55
References: 7
Authors: 9
Name | Order | Citations | PageRank
Prajit Ramachandran | 1 | 51 | 6.24
Tom Le Paine | 2 | 86 | 3.70
Pooya Khorrami | 3 | 118 | 6.27
Mohammad Babaeizadeh | 4 | 70 | 6.52
Shiyu Chang | 5 | 770 | 51.07
Yang Zhang | 6 | 39 | 5.01
Mark Hasegawa-Johnson | 7 | 1189 | 112.85
Roy Campbell | 8 | 5133 | 573.61
Thomas S. Huang | 9 | 27815 | 2618.42