Title
Exploring Deep Reuse in Winograd CNN Inference
Abstract
Convolutional neural networks (CNNs), as representatives of deep learning, are among the most commonly used neural networks in applications such as graphic image analysis. However, CNNs are computationally heavy; training a network can take several hours even on modern processors. Unlike training, inference is often executed on devices with limited computing power, such as CPUs. Fortunately, the Winograd minimal filtering algorithm can reduce the cost of convolution by reducing the number of multiplication operations. We find that Winograd convolution can be further accelerated by reusing similar data and computation patterns, a technique called deep reuse.
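For context, the multiplication savings the abstract refers to can be seen in the smallest Winograd case, F(2,3), which produces two outputs of a 3-tap filter with 4 multiplications instead of the 6 a direct computation needs. The sketch below is only an illustration of that idea, not the paper's implementation (the paper targets 2D Winograd convolution and its deep-reuse mechanism); the function names and test values are assumptions made for the example.

```python
import numpy as np

def winograd_f23(d, g):
    """Winograd minimal filtering F(2,3): two outputs of a 3-tap filter
    computed with 4 multiplications instead of 6."""
    d0, d1, d2, d3 = d
    g0, g1, g2 = g
    # Four multiplications; the filter-side factors could be precomputed once per filter.
    m1 = (d0 - d2) * g0
    m2 = (d1 + d2) * ((g0 + g1 + g2) / 2)
    m3 = (d2 - d1) * ((g0 - g1 + g2) / 2)
    m4 = (d1 - d3) * g2
    return np.array([m1 + m2 + m3, m2 - m3 - m4])

def direct_f23(d, g):
    """Direct sliding-window computation of the same two outputs (6 multiplications)."""
    return np.array([d[0]*g[0] + d[1]*g[1] + d[2]*g[2],
                     d[1]*g[0] + d[2]*g[1] + d[3]*g[2]])

d = np.array([1.0, 2.0, 3.0, 4.0])   # a 4-element input tile (hypothetical values)
g = np.array([0.5, -1.0, 2.0])       # a 3-tap filter (hypothetical values)
assert np.allclose(winograd_f23(d, g), direct_f23(d, g))
```

The 2D variant commonly used in CNN layers, F(2x2, 3x3), nests this transform along both dimensions and reduces 36 multiplications per output tile to 16.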
Year
2021
DOI
10.1145/3437801.3441588
Venue
PPoPP
DocType
Conference
Citations
1
PageRank
0.35
References
0
Authors
5
Name, Order, Citations, PageRank
Ruofan Wu, 1, 1, 1.03
Feng Zhang, 2, 1, 0.35
Zhen Zheng, 3, 1, 0.69
Xiaoyong Du, 4, 10, 4.70
Xipeng Shen, 5, 2025, 118.55