Title
Variational Information Distillation for Knowledge Transfer
Abstract
Transferring knowledge from a teacher neural network pretrained on the same or a similar task to a student neural network can significantly improve the performance of the student network. Existing knowledge transfer approaches match the activations or the corresponding hand-crafted features of the teacher and the student networks. We propose an information-theoretic framework that formulates knowledge transfer as maximizing the mutual information between the teacher and the student networks. We compare our method with existing knowledge transfer methods on both knowledge distillation and transfer learning tasks and show that it consistently outperforms them. We further demonstrate the strength of our method on knowledge transfer across heterogeneous network architectures by transferring knowledge from a convolutional neural network (CNN) to a multi-layer perceptron (MLP) on CIFAR-10. The resulting MLP significantly outperforms state-of-the-art methods and achieves performance similar to a CNN with a single convolutional layer.
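To make the abstract's objective concrete, below is a minimal sketch (not the authors' released implementation) of a variational lower bound on the mutual information between a teacher feature map and a student feature map: a Gaussian q(t | s) whose mean is produced by a regressor on the student features and whose per-channel variance is learned. All names (VIDLoss, mean, log_scale), the 1x1-convolution regressor, and the toy shapes are illustrative assumptions.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class VIDLoss(nn.Module):
    """Hypothetical sketch of a variational information distillation loss.

    Fits a Gaussian q(t | s) to teacher activations t given student
    activations s; minimizing the Gaussian negative log-likelihood
    maximizes a variational lower bound on the mutual information I(t; s).
    """

    def __init__(self, student_channels, teacher_channels):
        super().__init__()
        # 1x1 conv maps student features to the teacher's channel width
        # and serves as the mean of q(t | s). (Assumed architecture.)
        self.mean = nn.Conv2d(student_channels, teacher_channels, 1)
        # Unconstrained per-channel parameter; softplus keeps variance > 0.
        self.log_scale = nn.Parameter(torch.zeros(teacher_channels))

    def forward(self, s_feat, t_feat):
        # Assumes s_feat and t_feat share the same spatial resolution.
        mu = self.mean(s_feat)
        var = F.softplus(self.log_scale).view(1, -1, 1, 1) + 1e-6
        # Per-element Gaussian NLL, up to an additive constant.
        nll = 0.5 * (torch.log(var) + (t_feat - mu) ** 2 / var)
        return nll.mean()

# Usage: add this term to the student's task loss during training.
vid = VIDLoss(student_channels=64, teacher_channels=256)
s_feat = torch.randn(8, 64, 16, 16)   # student feature map (toy shapes)
t_feat = torch.randn(8, 256, 16, 16)  # teacher feature map
loss = vid(s_feat, t_feat)
```

In practice such a bound would be applied at several teacher-student layer pairs; the sketch assumes matched spatial sizes for simplicity.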
Year
2019
DOI
10.1109/CVPR.2019.00938
Venue
2019 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR 2019)
Field
Computer vision, Engineering drawing, Computer science, Knowledge transfer, Distillation, Artificial intelligence
DocType
Conference
Volume
Journal
abs/1904.05835
ISSN
1063-6919
Citations
12
PageRank
0.79
References
0
Authors
5
Name | Order | Citations | PageRank
Sungsoo Ahn | 1 | 12 | 1.13
Xu Hu | 2 | 36 | 4.46
Andreas Damianou | 3 | 151 | 17.68
Neil D. Lawrence | 4 | 3411 | 268.51
Zhenwen Dai | 5 | 93 | 13.83