Title
Loss-Driven Channel Pruning Of Convolutional Neural Networks
Abstract
The growing computation and storage cost of convolutional neural networks (CNNs) severely hinders their deployment on resource-limited devices. As a result, there is a pressing need to accelerate these networks. In this paper, we propose a loss-driven method to prune redundant channels of CNNs. It identifies unimportant channels by applying a Taylor expansion of the loss with respect to the scaling and shifting factors, and prunes those channels using a fixed percentile threshold. In this way, we obtain a compact network with fewer parameters and lower FLOPs. In the experimental section, we evaluate the proposed method on the CIFAR datasets with several popular networks, including VGG-19, DenseNet-40 and ResNet-164; the results demonstrate that the proposed method can prune over 70% of channels and parameters with no performance loss. Moreover, iterative pruning can be applied to obtain an even more compact network.
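The criterion the abstract describes can be sketched as follows. This is a minimal illustration, not the authors' exact implementation: it assumes the scaling and shifting factors are the per-channel batch-normalization parameters (gamma, beta), scores each channel by the first-order Taylor estimate of the loss change when its factors are zeroed, and keeps only channels whose score exceeds a fixed percentile threshold. All function and variable names here are hypothetical.

```python
import numpy as np

def channel_importance(gamma, beta, grad_gamma, grad_beta):
    """First-order Taylor estimate of the loss change if a channel's
    scaling (gamma) and shifting (beta) factors are set to zero:
        |dL/dgamma * gamma + dL/dbeta * beta|
    All arguments are 1-D arrays with one entry per channel."""
    return np.abs(grad_gamma * gamma + grad_beta * beta)

def prune_mask(scores, percentile=70.0):
    """Return a boolean keep-mask: True for channels whose importance
    exceeds the fixed percentile threshold (here, prune ~70%)."""
    threshold = np.percentile(scores, percentile)
    return scores > threshold
```

For example, with ten channels scored 1..10 and a 70th-percentile threshold, only the three highest-scoring channels are kept; the same mask could then drive the removal of the corresponding filters before fine-tuning.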
Year
2020
DOI
10.1587/transinf.2019EDL8200
Venue
IEICE TRANSACTIONS ON INFORMATION AND SYSTEMS
Keywords
channel pruning, convolutional neural networks, Taylor expansion, fine-tuning, iterative pruning
DocType
Journal
Volume
E103D
Issue
5
ISSN
1745-1361
Citations
0
PageRank
0.34
References
0
Authors
5
Name            Order  Citations  PageRank
Xin Long        1      0          0.34
Xiangrong Zeng  2      0          0.34
C. H. Cheng     3      186        10.13
Huaxin Xiao     4      22         8.41
Maojun Zhang    5      314        48.74