Title
Speeding up Convolutional Neural Networks By Exploiting the Sparsity of Rectifier Units
Abstract
Rectified linear units (ReLUs) are widely used in deep convolutional networks. A ReLU maps negative values to zero and leaves positive values unchanged, which makes its outputs highly sparse. In this work, we first examine the sparsity of ReLU outputs in several popular deep convolutional architectures. We then exploit this sparsity to accelerate convolution by skipping the calculations associated with zero-valued neurons. The proposed sparse convolution algorithm achieves speedups on CPUs over the traditional matrix-matrix multiplication algorithm for convolution when the sparsity is at least 0.9.
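The acceleration idea summarized in the abstract, skipping all multiply-accumulate work attached to zero-valued activations, can be sketched in a few lines. The following is a minimal, unoptimized Python illustration under stated assumptions, not the authors' implementation: the scatter formulation, the function name sparse_conv2d, and the single-channel valid-convolution setting are choices made here for clarity.

    import numpy as np

    def sparse_conv2d(inp, kernel):
        # inp:    (H, W) activation map, assumed to be a ReLU output
        #         (mostly zeros). kernel: (kh, kw) filter.
        # Returns the (H-kh+1, W-kw+1) valid cross-correlation.
        H, W = inp.shape
        kh, kw = kernel.shape
        oh, ow = H - kh + 1, W - kw + 1
        out = np.zeros((oh, ow))
        # Scatter formulation: each nonzero input pixel inp[i, j]
        # contributes inp[i, j] * kernel[u, w] to out[i - u, j - w].
        # Zero pixels are skipped entirely, which is where the
        # savings come from when the input is highly sparse.
        for i, j in zip(*np.nonzero(inp)):
            v = inp[i, j]
            for u in range(kh):
                oi = i - u
                if oi < 0 or oi >= oh:
                    continue
                for w in range(kw):
                    oj = j - w
                    if 0 <= oj < ow:
                        out[oi, oj] += v * kernel[u, w]
        return out

    # Hypothetical self-check against a dense reference computation.
    x = np.maximum(np.random.randn(8, 8), 0.0)  # ReLU output, ~50% zeros
    k = np.random.randn(3, 3)
    ref = np.array([[(x[i:i+3, j:j+3] * k).sum() for j in range(6)]
                    for i in range(6)])
    assert np.allclose(sparse_conv2d(x, k), ref)

With activations that are at least 90% zeros, the inner loops run for only a small fraction of the input pixels, which mirrors the sparsity level (0.9 and above) at which the abstract reports speedups over matrix-multiplication-based convolution.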
Year
2017
Venue
arXiv: Computer Vision and Pattern Recognition
Field
Rectifier, Multiplication algorithm, Pattern recognition, Convolution, Computer science, Convolutional neural network, Artificial intelligence, Machine learning, Speedup
DocType
Journal
Volume
abs/1704.07724
Citations
2
PageRank
0.38
References
11
Authors
2
Name            Order   Citations   PageRank
Shaohuai Shi    1       41          4.62
Xiaowen Chu     2       1273        101.81