Title: Positive-Unlabeled Compression on the Cloud
Abstract: Many attempts have been made to extend the great success of convolutional neural networks (CNNs) achieved on high-end GPU servers to portable devices such as smartphones. Providing compression and acceleration services for deep learning models on the cloud is therefore significant and attractive to end users. However, existing network compression and acceleration approaches usually fine-tune the svelte model by requesting the entire original training data (e.g., ImageNet), which could be more cumbersome than the network itself and cannot be easily uploaded to the cloud. In this paper, we present a novel positive-unlabeled (PU) setting for addressing this problem. In practice, only a small portion of the original training set is required as positive examples, and more useful training examples can be obtained from the massive unlabeled data on the cloud through a PU classifier with an attention-based multi-scale feature extractor. We further introduce a robust knowledge distillation (RKD) scheme to deal with the class imbalance problem of these newly augmented training examples. The superiority of the proposed method is verified through experiments conducted on benchmark models and datasets. Using only 8% of uniformly selected data from ImageNet, we obtain an efficient model with performance comparable to the baseline ResNet-34.
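The abstract describes the PU classifier only at a high level. As a point of reference, below is a minimal sketch of the standard non-negative PU risk estimator (Kiryo et al., 2017), the usual training loss for this setting. The function name nn_pu_risk, the logistic surrogate loss, and the class-prior argument are illustrative assumptions, not the paper's implementation; the attention-based multi-scale feature extractor is not reproduced here.

import torch

def nn_pu_risk(scores_pos, scores_unl, prior, loss=torch.nn.functional.softplus):
    """Non-negative PU risk (Kiryo et al., 2017) for a binary scorer g.

    scores_pos: g(x) on the small labeled positive set (e.g., the uploaded
                fraction of the original training data)
    scores_unl: g(x) on the unlabeled data pool on the cloud
    prior:      assumed class prior pi_p = P(y = +1); a hypothetical input
    loss:       surrogate loss l(z); softplus(-z) gives the logistic loss
    """
    # Positive-class risk: pi_p * E_p[l(g(x))], treating positives as +1
    risk_pos = prior * loss(-scores_pos).mean()
    # Negative-class risk estimated from unlabeled data:
    # E_u[l(-g(x))] - pi_p * E_p[l(-g(x))], clamped at zero (the
    # "non-negative" correction) to curb overfitting
    risk_neg = loss(scores_unl).mean() - prior * loss(scores_pos).mean()
    return risk_pos + torch.clamp(risk_neg, min=0.0)

A classifier trained with such a risk can then score the unlabeled pool, and high-scoring examples augment the training set for distillation; the paper's RKD step, which reweights the resulting imbalanced classes, is not sketched here.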
Year: 2019
Venue: Advances in Neural Information Processing Systems 32 (NeurIPS 2019)
Keywords: convolutional neural network, end users, portable devices, smart phones
Field: Compression (physics), Computer science, Artificial intelligence, Machine learning, Cloud computing
DocType: Conference
Volume: 32
ISSN: 1049-5258
Citations: 2
PageRank: 0.36
References: 0
Authors: 7
Name          Order  Citations  PageRank
Yixing Xu     1      9          5.09
Yunhe Wang    2      113        22.76
Hanting Chen  3      29         7.32
Kai Han       4      55         11.16
Chunjing Xu   5      61         16.98
Dacheng Tao   6      19032      747.78
Chang Xu      7      781        47.60