Title
Network Pruning via Annealing and Direct Sparsity Control
Abstract
Artificial neural networks (ANNs), especially deep convolutional neural networks, are very popular these days and have been shown to offer reliable solutions to many vision problems. However, the use of deep neural networks is widely impeded by their intensive computational and memory costs. In this paper, we propose a novel, efficient network pruning framework that is suitable for both non-structured and structured channel-level pruning. Our proposed method tightens a sparsity constraint by gradually removing network parameters or filter channels based on a criterion and a schedule. The attractive fact that the network size keeps dropping throughout the iterations makes it suitable for pruning any untrained or pre-trained network. Because our method uses an L0 constraint instead of an L1 penalty, it does not introduce any bias into the trained parameters or filter channels. Furthermore, the L0 constraint makes it easy to directly specify the desired sparsity level during the network pruning process. Finally, experimental validation on extensive synthetic and real vision datasets shows that the proposed method obtains better or competitive performance compared to other state-of-the-art network pruning methods.
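To make the idea concrete, below is a minimal sketch, not the authors' exact algorithm, of what "tightening an L0 sparsity constraint on a schedule" can look like: a layer's weights are pruned in stages by keeping only the largest-magnitude entries, with the kept fraction annealed toward a directly specified target sparsity. The linear schedule and the magnitude criterion are illustrative assumptions.

```python
# Illustrative sketch only (assumed schedule and criterion, not the paper's exact method).
import numpy as np

def kept_fraction(step, total_steps, target_sparsity):
    """Annealing schedule: fraction of weights kept at a given step.

    Starts at 1.0 and decays linearly to (1 - target_sparsity); any
    monotonically decreasing schedule could be substituted here.
    """
    final_keep = 1.0 - target_sparsity
    return 1.0 - (1.0 - final_keep) * min(step / total_steps, 1.0)

def prune_step(weights, keep_frac):
    """Enforce the L0-style constraint directly: zero out all but the
    largest-magnitude weights so that about keep_frac of them remain."""
    flat = np.abs(weights).ravel()
    k = max(1, int(round(keep_frac * flat.size)))
    threshold = np.partition(flat, flat.size - k)[flat.size - k]
    mask = np.abs(weights) >= threshold
    return weights * mask, mask

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    w = rng.normal(size=(64, 64))   # stand-in for one layer's weight matrix
    target_sparsity = 0.9           # desired final sparsity, specified directly
    total_steps = 10

    for step in range(1, total_steps + 1):
        # ...a few epochs of (re)training would normally run between prunes...
        w, mask = prune_step(w, kept_fraction(step, total_steps, target_sparsity))
        print(f"step {step:2d}: kept {mask.mean():.2%} of weights")
```

Because the constraint is stated as a target count of non-zero weights rather than a penalty strength, the final sparsity level is controlled exactly, which is the practical advantage of the L0 formulation highlighted in the abstract.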
Year: 2021
DOI: 10.1109/IJCNN52387.2021.9533741
Venue: 2021 International Joint Conference on Neural Networks (IJCNN)
DocType: Conference
ISSN: 2161-4393
Citations: 0
PageRank: 0.34
References: 0
Authors: 3
Name            Order   Citations   PageRank
Yangzi Guo      1       0           0.68
Yiyuan She      2       148         11.66
Adrian Barbu    3       768         58.59