Title
FOCM: Faster Octave Convolution Using Mix-scaling
Abstract
Octave convolution, which separates feature maps by resolution, is an effective method for reducing spatial redundancy in Convolutional Neural Networks (CNNs). In this paper, we propose a faster version of octave convolution, FOCM, which further reduces the computation cost of CNNs. Like octave convolution, FOCM divides the input and output feature maps into domains of different resolutions, but without explicit information exchange among them. In addition, FOCM utilizes mix-scaled convolution kernels to learn spatial features of different sizes. Experiments on ResNets of various depths with the ImageNet dataset show that FOCM reduces the operations of the original models by 33.9% to 46.4%, and saves 11.1% to 21.7% of the FLOPs of models using octave convolutions, with similar top-1 and top-5 accuracy.
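The computational saving described in the abstract can be sketched with per-layer FLOP counts. The sketch below is an illustration, not the paper's method: it uses the standard four-path cost model of octave convolution (high/low frequency channels, low path at half resolution), and models the FOCM idea from the abstract by dropping the two cross-resolution paths ("without explicit information exchange"). The mix-scaled kernels are not modeled, and `alpha` (the low-resolution channel fraction) is an assumed hyperparameter.

```python
# Illustrative multiply-accumulate counts for one k x k convolution layer.
# alpha = fraction of channels kept at half spatial resolution.

def vanilla_flops(h, w, c_in, c_out, k):
    # Standard convolution: every output pixel sees k*k*c_in inputs.
    return h * w * k * k * c_in * c_out

def octave_flops(h, w, c_in, c_out, k, alpha):
    # Four paths of octave convolution: high->high, high->low,
    # low->high, low->low; cross/low paths run at half resolution.
    hh = h * w * k * k * (1 - alpha) * c_in * (1 - alpha) * c_out
    hl = (h // 2) * (w // 2) * k * k * (1 - alpha) * c_in * alpha * c_out
    lh = (h // 2) * (w // 2) * k * k * alpha * c_in * (1 - alpha) * c_out
    ll = (h // 2) * (w // 2) * k * k * alpha * c_in * alpha * c_out
    return hh + hl + lh + ll

def focm_flops(h, w, c_in, c_out, k, alpha):
    # FOCM-style variant (assumption per the abstract): only the two
    # same-resolution paths remain, no cross-resolution exchange.
    hh = h * w * k * k * (1 - alpha) * c_in * (1 - alpha) * c_out
    ll = (h // 2) * (w // 2) * k * k * alpha * c_in * alpha * c_out
    return hh + ll

if __name__ == "__main__":
    h, w, c, k, alpha = 56, 56, 256, 3, 0.5  # typical ResNet stage shape
    v = vanilla_flops(h, w, c, c, k)
    o = octave_flops(h, w, c, c, k, alpha)
    f = focm_flops(h, w, c, c, k, alpha)
    print(f"octave saves {1 - o / v:.1%} vs vanilla")
    print(f"FOCM-style saves {1 - f / o:.1%} vs octave")
```

Per-layer savings computed this way are larger than the whole-model figures quoted in the abstract, since a full network also contains layers (e.g. the stem and classifier) that do not use the factorized convolution.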
Year
2021
DOI
10.1109/TAAI54685.2021.00015
Venue
2021 International Conference on Technologies and Applications of Artificial Intelligence (TAAI)
Keywords
octave convolution, mix-scaling, multiple resolution
DocType
Conference
ISSN
2376-6816
ISBN
978-1-6654-0826-4
Citations
0
PageRank
0.34
References
1
Authors
3
Name              Order  Citations  PageRank
Kuan-Hsian Hsieh  1      0          0.34
Erh-Chung Chen    2      0          0.34
Che-Rung Lee      3      9          6.64