Title
Prune it Yourself: Automated Pruning by Multiple Level Sensitivity
Abstract
Deep neural network pruning reduces model size by removing redundant structures and weights. Existing methods rely on sensitivity information from a single layer and ignore the other layers, and the pruning process simply removes all selected weights at the same time. To address these limitations, we propose the Prune it Yourself (PIY) framework. First, we collect both filter and channel sensitivity information, then combine them to decide which structures to prune. Finally, we apply a gradual pruning algorithm that reduces the accuracy loss without introducing extra hyper-parameters. We conduct experiments with VGG-16 and ResNet on CIFAR-10 and ImageNet, and the results demonstrate the effectiveness of our method.
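The abstract describes the PIY pipeline only at a high level, so the following is a minimal, hypothetical sketch of sensitivity-guided gradual filter pruning rather than the authors' implementation. It assumes a PyTorch setting, approximates filter sensitivity by the L1 norm of each filter and channel sensitivity by the L1 norm of the matching input channel in the following layer, combines the two views with an equal-weight normalized sum, and spreads the target sparsity over several steps; the function names `combined_sensitivity` and `gradual_prune`, the combination rule, and the linear sparsity schedule are all assumptions, not details taken from the paper.

```python
# Hypothetical sketch of sensitivity-guided gradual filter pruning.
# The exact sensitivity measures and combination rule of PIY are not given
# in the abstract; the choices below (L1 norms, equal weighting, linear
# sparsity schedule) are illustrative assumptions.

import torch
import torch.nn as nn


def combined_sensitivity(conv: nn.Conv2d, next_conv: nn.Conv2d) -> torch.Tensor:
    """Score each output filter of `conv` using both its own weights
    (filter view) and how strongly the next layer reads from it (channel view)."""
    # Filter sensitivity: L1 norm of each output filter of this layer.
    filt = conv.weight.detach().abs().sum(dim=(1, 2, 3))       # (out_channels,)
    # Channel sensitivity: L1 norm of the matching input channel of the next layer.
    chan = next_conv.weight.detach().abs().sum(dim=(0, 2, 3))  # (in_channels,)
    # Normalize each view to [0, 1] and combine (assumed equal weighting).
    filt = filt / (filt.max() + 1e-12)
    chan = chan / (chan.max() + 1e-12)
    return filt + chan


def gradual_prune(conv: nn.Conv2d, next_conv: nn.Conv2d,
                  target_ratio: float = 0.5, steps: int = 5) -> None:
    """Zero out the least sensitive filters over several steps instead of all at once."""
    n_filters = conv.out_channels
    for step in range(1, steps + 1):
        ratio = target_ratio * step / steps     # sparsity level for this step
        k = int(n_filters * ratio)
        if k == 0:
            continue
        scores = combined_sensitivity(conv, next_conv)
        prune_idx = torch.argsort(scores)[:k]   # least sensitive filters
        with torch.no_grad():
            conv.weight[prune_idx] = 0.0
            if conv.bias is not None:
                conv.bias[prune_idx] = 0.0
            next_conv.weight[:, prune_idx] = 0.0
        # In practice a few fine-tuning epochs would follow each pruning step.


if __name__ == "__main__":
    layer = nn.Conv2d(3, 64, 3, padding=1)
    following = nn.Conv2d(64, 128, 3, padding=1)
    gradual_prune(layer, following, target_ratio=0.5, steps=5)
    kept = int((layer.weight.abs().sum(dim=(1, 2, 3)) > 0).sum())
    print(f"filters kept: {kept}/64")
```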
Year
2020
DOI
10.1109/MIPR49039.2020.00022
Venue
2020 IEEE Conference on Multimedia Information Processing and Retrieval (MIPR)
Keywords
deep neural network, model compression, pruning
DocType
Conference
ISBN
978-1-7281-4273-9
Citations
0
PageRank
0.34
References
2
Authors
4
Name            Order  Citations  PageRank
Zhaoyi Yan      1      0          1.69
Peiyin Xing     2      0          0.34
Yaowei Wang     3      134        29.62
Yonghong Tian   4      1057       102.81