Title
Towards Thinner Convolutional Neural Networks Through Gradually Global Pruning
Abstract
Deep network pruning is an effective method to reduce the storage and computation cost of deep neural networks when deploying them on resource-limited devices. Among the many pruning granularities, neuron-level pruning removes redundant neurons and filters from the model, resulting in thinner networks. In this paper, we propose a gradually global pruning scheme for neuron-level pruning. In each pruning step, a small percentage of neurons is selected and dropped across all layers of the model. We also propose a simple method to eliminate the biases in evaluating the importance of neurons across layers, which makes the scheme feasible. Compared with layer-wise pruning schemes, our scheme avoids the difficulty of determining the redundancy of each layer and is more effective for deep networks. Given a performance requirement, our scheme automatically finds a thinner sub-network within the original network.
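The abstract does not spell out the importance measure or the bias-elimination method, so the following is only an illustrative sketch of one gradually global pruning step: it assumes per-filter importance is the L2 norm of the filter weights, and that dividing by the layer's mean importance removes the cross-layer scale bias so scores can be ranked globally. The function name global_pruning_step and the parameter drop_ratio are hypothetical, not from the paper.

```python
# Minimal sketch of one gradually global pruning step (PyTorch).
# Assumptions (not from the paper): L2-norm filter importance,
# per-layer mean normalization as the bias-elimination step.
import torch
import torch.nn as nn

def global_pruning_step(model: nn.Module, drop_ratio: float = 0.02) -> int:
    """Zero out the globally least-important drop_ratio of conv filters."""
    modules = dict(model.named_modules())
    scores = []  # (layer name, filter index, normalized importance)
    for name, module in modules.items():
        if isinstance(module, nn.Conv2d):
            # Per-filter importance: L2 norm over (in_ch, kH, kW).
            imp = module.weight.detach().flatten(1).norm(p=2, dim=1)
            # Normalize within the layer so scores are comparable
            # across layers (hypothetical bias-elimination step).
            imp = imp / (imp.mean() + 1e-12)
            scores += [(name, i, v.item()) for i, v in enumerate(imp)]
    # Rank all filters globally and drop the bottom drop_ratio fraction.
    scores.sort(key=lambda t: t[2])
    n_drop = int(len(scores) * drop_ratio)
    with torch.no_grad():
        for name, idx, _ in scores[:n_drop]:
            module = modules[name]
            module.weight[idx].zero_()  # mask the pruned filter
            if module.bias is not None:
                module.bias[idx] = 0.0
    return n_drop
```

In the actual scheme, the selected filters would be physically removed (yielding a thinner network) and the model fine-tuned between pruning steps; the masking above only illustrates the global, cross-layer selection.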
Year
2017
Venue
2017 24th IEEE International Conference on Image Processing (ICIP)
Keywords
Artificial neural networks, Deep learning, Deep compression
DocType
Conference
Volume
abs/1703.09916
ISSN
1522-4880
Citations
0
PageRank
0.34
References
15
Authors
5
Name           Order  Citations  PageRank
Zhengtao Wang    1        78       3.40
Ce Zhu           2      1473     117.79
Zhiqiang Xia     3         0       0.34
Qi Guo           4        43      12.11
Yipeng Liu       5        43       5.93