Title
Manifold Regularized Dynamic Network Pruning
Abstract
Neural network pruning is an essential approach for reducing the computational complexity of deep models so that they can be deployed on resource-limited devices. Compared with conventional methods, recently developed dynamic pruning methods determine redundant filters adaptively for each input instance, which achieves higher acceleration. However, most existing methods discover an effective sub-network for each instance independently and do not exploit the relationships between different inputs. To maximally excavate the redundancy in a given network architecture, this paper proposes a new paradigm that dynamically removes redundant filters by embedding the manifold information of all instances into the space of pruned networks (dubbed ManiDP). We first investigate the recognition complexity and feature similarity between images in the training set. Then, the manifold relationship between instances and their pruned sub-networks is aligned during training. The effectiveness of the proposed method is verified on several benchmarks, showing better performance in terms of both accuracy and computational cost than state-of-the-art methods. For example, our method can reduce 55.3% of the FLOPs of ResNet-34 with only a 0.57% top-1 accuracy degradation on ImageNet. The code will be available at https://github.com/huawei-noah/Pruning/tree/master/ManiDP.
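The abstract describes two ingredients: a gating mechanism that decides, per input instance, which filters of a layer can be skipped, and a regularizer that aligns the similarity structure (the manifold) of the instances with that of their pruned sub-networks. The PyTorch sketch below illustrates that general recipe only; it is not the authors' released ManiDP code (see the repository linked above), and the names `ChannelGate`, `DynamicPrunedConv`, `manifold_alignment_loss`, and the cosine-similarity alignment penalty are illustrative assumptions.

```python
# Minimal, hypothetical sketch of per-instance (dynamic) channel pruning with a
# manifold-style alignment term, written to illustrate the abstract above.
# It is NOT the authors' ManiDP implementation; all names here are assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F


class ChannelGate(nn.Module):
    """Predicts a per-instance soft saliency for each output channel of a conv layer."""

    def __init__(self, in_channels: int, out_channels: int):
        super().__init__()
        self.fc = nn.Linear(in_channels, out_channels)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (N, C_in, H, W) -> pooled context (N, C_in) -> saliencies (N, C_out) in [0, 1]
        context = F.adaptive_avg_pool2d(x, 1).flatten(1)
        return torch.sigmoid(self.fc(context))


class DynamicPrunedConv(nn.Module):
    """Conv block whose output channels are re-weighted (softly pruned) per input."""

    def __init__(self, in_channels: int, out_channels: int):
        super().__init__()
        self.conv = nn.Conv2d(in_channels, out_channels, 3, padding=1, bias=False)
        self.bn = nn.BatchNorm2d(out_channels)
        self.gate = ChannelGate(in_channels, out_channels)

    def forward(self, x: torch.Tensor):
        mask = self.gate(x)                           # (N, C_out) instance-wise saliencies
        out = F.relu(self.bn(self.conv(x)))
        out = out * mask.unsqueeze(-1).unsqueeze(-1)  # channels with ~0 saliency are skipped
        return out, mask


def manifold_alignment_loss(features: torch.Tensor, masks: torch.Tensor) -> torch.Tensor:
    """Encourage instances with similar features to receive similar channel masks.

    features: (N, ...) intermediate features; masks: (N, C) per-instance saliencies.
    The pairwise cosine-similarity structure of the instances is aligned with that
    of their pruned sub-networks through a mean-squared-difference penalty.
    """
    f = F.normalize(features.flatten(1), dim=1)
    m = F.normalize(masks, dim=1)
    sim_f = f @ f.t()   # (N, N) similarity between instances
    sim_m = m @ m.t()   # (N, N) similarity between the induced sub-networks
    return (sim_f - sim_m).pow(2).mean()
```

In such a setup, the alignment penalty would typically be added to the task loss together with a sparsity term on the masks, e.g. `loss = task_loss + lambda_align * manifold_alignment_loss(out, mask) + lambda_sparse * mask.mean()`, where the two lambda weights are hypothetical hyper-parameters, not values from the paper.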
Year
2021
DOI
10.1109/CVPR46437.2021.00498
Venue
2021 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR 2021)
DocType
Conference
ISSN
1063-6919
Citations
0
PageRank
0.34
References
0
Authors
7
Name          Order   Citations   PageRank
Yehui Tang    1       8           6.41
Yunhe Wang    2       6           6.82
Yixing Xu     3       9           5.09
Yiping Deng   4       1           1.70
Chao Xu       5       3           44.24
Dacheng Tao   6       19032       747.78
Chang Xu      7       106         20.21