Title
Layer-compensated Pruning for Resource-constrained Convolutional Neural Networks.
Abstract
Resource-efficient convolutional neural networks enable not only intelligence on edge devices but also opportunities for system-level optimization such as scheduling. In this work, we aim to improve resource-constrained filter pruning by merging two commonly considered sub-problems, i.e., (i) how many filters to prune in each layer and (ii) which filters to prune given a per-layer pruning budget, into a single global filter ranking problem. Our framework entails a novel algorithm, dubbed layer-compensated pruning, in which meta-learning is used to find better solutions. We show empirically that the proposed algorithm is superior to prior art in both effectiveness and efficiency: it reduces the accuracy gap between the pruned and original networks from 0.9% to 0.7% while cutting the time needed for meta-learning by roughly 8x, from 1 hour down to 7 minutes. We demonstrate the effectiveness of our algorithm on ResNet and MobileNetV2 networks with the CIFAR-10, ImageNet, and Bird-200 datasets.
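To illustrate the idea of collapsing per-layer budget allocation and filter selection into one global ranking, the following is a minimal sketch, not the paper's actual method: it assumes a simple L2-norm importance score per filter and treats the per-layer compensation terms as given constants (in the paper they are obtained via meta-learning). All function and variable names here are illustrative.

```python
import numpy as np

def global_filter_ranking(layer_weights, compensation, num_to_prune):
    """Rank all filters across layers by a compensated importance score
    and return, per layer, the indices of filters to prune.

    layer_weights: list of arrays, one per conv layer, shaped
                   (num_filters, in_channels, kh, kw).
    compensation:  list of per-layer scalars added to each filter's score
                   (assumed given here; the paper learns such terms).
    num_to_prune:  total number of filters to remove globally.
    """
    scored = []  # (compensated score, layer index, filter index)
    for l, (w, c) in enumerate(zip(layer_weights, compensation)):
        # Magnitude-based importance: L2 norm of each filter's weights.
        importance = np.linalg.norm(w.reshape(w.shape[0], -1), axis=1)
        for f, s in enumerate(importance):
            scored.append((s + c, l, f))

    # Globally remove the filters with the lowest compensated scores;
    # the per-layer pruning budget emerges from this single ranking.
    scored.sort(key=lambda t: t[0])
    pruned = {l: [] for l in range(len(layer_weights))}
    for _, l, f in scored[:num_to_prune]:
        pruned[l].append(f)
    return pruned

# Toy usage: two conv layers with 8 and 16 filters, prune 6 filters globally.
rng = np.random.default_rng(0)
weights = [rng.standard_normal((8, 3, 3, 3)),
           rng.standard_normal((16, 8, 3, 3))]
comp = [0.0, 0.5]  # hypothetical compensation that favors keeping layer-1 filters
print(global_filter_ranking(weights, comp, num_to_prune=6))
```

Note how the compensation terms shift each layer's scores before the single global sort, so how many filters each layer loses is decided jointly with which filters are lost.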
Year
2018
Venue
arXiv: Computer Vision and Pattern Recognition
Field
Ranking, Convolutional neural network, Scheduling (computing), Convolution, Computer science, Edge device, Artificial intelligence, Merge (version control), Artificial neural network, Machine learning, Pruning
DocType
Journal
Volume
abs/1810.00518
Citations
3
PageRank
0.41
References
16
Authors
3
Name, Order, Citations, PageRank
Ting-Wu Chin, 1, 26, 5.66
Cha Zhang, 2, 1671, 115.71
Diana Marculescu, 3, 2725, 223.87