Title: Towards effective deep transfer via attentive feature alignment

Abstract: Training a deep convolutional network from scratch requires a large amount of labeled data, which may not be available for many practical tasks. To alleviate the data burden, a practical approach is to adapt a pre-trained model learned on a large source domain to the target domain, but the performance can be limited when the source and target data distributions differ substantially. Some recent works attempt to alleviate this issue by imposing feature alignment over the intermediate feature maps between the source and target networks. However, for a source model, many of the channels and spatial features in each layer can be irrelevant to the target task, so directly applying feature alignment may not achieve promising performance. In this paper, we propose an Attentive Feature Alignment (AFA) method for effective domain knowledge transfer by identifying and attending to the relevant channels and spatial features between two domains. To this end, we devise two learnable attentive modules at the channel and spatial levels. We then sequentially perform attentive spatial- and channel-level feature alignments between the source and target networks, in which the target model and attentive modules are learned simultaneously. Moreover, we theoretically analyze the generalization performance of our method, which confirms its superiority to existing methods. Extensive experiments on both image classification and face recognition demonstrate the effectiveness of our method. The source code and the pre-trained models are available at https://github.com/xiezheng-cs/AFA.
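The abstract describes weighting the source-target feature discrepancy by channel- and spatial-level attention so that irrelevant features do not dominate the alignment loss. The sketch below is a minimal, hypothetical NumPy illustration of that idea only; in AFA the attention weights come from learnable modules trained jointly with the target network, whereas here they are supplied as fixed arrays, and the function name and signature are invented for illustration.

```python
import numpy as np

def attentive_alignment_loss(f_src, f_tgt, channel_attn, spatial_attn):
    """Simplified sketch of an attention-weighted feature-alignment loss.

    f_src, f_tgt: feature maps of shape (C, H, W) from the source and
        target networks at a matched layer.
    channel_attn: (C,) non-negative weights marking relevant channels.
    spatial_attn: (H, W) non-negative weights marking relevant locations.
    In AFA both attention maps are produced by learnable modules; here
    they are given constants, so this is an illustration, not the method.
    """
    diff = (f_src - f_tgt) ** 2                # per-element squared error
    diff = diff * channel_attn[:, None, None]  # down-weight irrelevant channels
    diff = diff * spatial_attn[None, :, :]     # down-weight irrelevant locations
    return diff.mean()
```

With all attention weights equal to one this reduces to a plain mean-squared feature-alignment loss; zeroing a channel's weight removes that channel from the objective entirely, which is the intuition behind attending only to transfer-relevant features.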
Year: 2021
DOI: 10.1016/j.neunet.2021.01.022
Venue: Neural Networks
Keywords: Deep transfer, Knowledge distillation, Attention mechanism
DocType: Journal
Volume: 138
Issue: 1
ISSN: 0893-6080
Citations: 0
PageRank: 0.34
References: 0
Authors: 5
Name           Order  Citations  PageRank
Zheng Xie      1      0          0.34
Zhiquan Wen    2      0          1.01
Yaowei Wang    3      134        29.62
Wu Qingyao     4      259        33.46
Rui Tang       5      188        19.22