Title
Factorized Convolutional Neural Networks.
Abstract
In this paper, we propose to factorize the convolutional layer to reduce its computation. The 3D convolution operation in a convolutional layer can be considered as performing a spatial convolution in each channel and a linear projection across channels simultaneously. By unravelling these two operations and arranging the spatial convolutions sequentially, the proposed layer is composed of a low-cost single intra-channel convolution and a linear channel projection. When combined with a residual connection, it effectively preserves the spatial information and maintains accuracy with significantly less computation. We also introduce a topological subdivisioning to reduce the connections between the input and output channels. Our experiments demonstrate that the proposed layers outperform standard convolutional layers in performance/complexity ratio. Our models achieve performance similar to VGG-16, ResNet-34, ResNet-50, and ResNet-101 while requiring 42x, 7.32x, 4.38x, and 5.85x less computation, respectively.
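The abstract describes replacing a standard convolution with an intra-channel (per-channel) spatial convolution followed by a linear channel projection, plus a residual connection. A minimal NumPy sketch of that decomposition, with assumed function and parameter names (`factorized_conv`, `spatial_kernels`, `projection` are illustrative, not from the paper):

```python
import numpy as np

def factorized_conv(x, spatial_kernels, projection, residual=True):
    """Sketch of the factorized layer described in the abstract (names assumed):
    a per-channel spatial convolution, then a 1x1 linear projection across
    channels, with an optional residual connection.
    x: (C, H, W); spatial_kernels: (C, k, k); projection: (C_out, C)."""
    C, H, W = x.shape
    k = spatial_kernels.shape[1]
    pad = k // 2
    xp = np.pad(x, ((0, 0), (pad, pad), (pad, pad)))
    # Intra-channel convolution: each channel gets its own k x k kernel,
    # so the spatial step costs C*k*k per pixel instead of C_out*C*k*k.
    spatial = np.empty_like(x)
    for c in range(C):
        for i in range(H):
            for j in range(W):
                spatial[c, i, j] = np.sum(xp[c, i:i + k, j:j + k]
                                          * spatial_kernels[c])
    # Linear channel projection: a 1x1 convolution mixes channels per pixel.
    out = np.einsum('oc,chw->ohw', projection, spatial)
    if residual and out.shape == x.shape:
        out = out + x  # residual connection helps preserve spatial information
    return out
```

A standard convolution would compute all `C_out * C` spatial filters jointly; this factorization computes only `C` spatial filters and then a cheap channel mixing, which is where the claimed computation savings come from.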
Year
2017
DOI
10.1109/iccvw.2017.71
Venue
International Conference on Computer Vision
DocType
Conference
Volume
abs/1608.04337
Citations
5
PageRank
0.41
References
9
Authors
3
Name | Order | Citations | PageRank
Min Wang | 1 | 169 | 36.41
Baoyuan Liu | 2 | 132 | 5.64
Hassan Foroosh | 3 | 748 | 59.98