Title
Non-Negative Transfer Learning With Consistent Inter-Domain Distribution
Abstract
In this letter, we propose a novel transfer learning approach that simultaneously exploits intra-domain differentiation and inter-domain correlation to address two drawbacks shared by many existing transfer learning methods: they either cannot handle negative samples or impose strict assumptions on the distribution. Specifically, a sample selection strategy is introduced to handle negative samples by exploiting the local geometric structure and the label information of the source samples. Furthermore, pseudo target labels are imposed to relax the assumption on the inter-domain distribution so that the inter-domain correlation can be taken into account. An efficient alternating iterative algorithm is then proposed to solve the formulated optimization problem with multiple constraints. Extensive experiments conducted on eleven real-world datasets show the superiority of our method over state-of-the-art approaches, e.g., it achieves an 11.23% improvement on the MNIST dataset.
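The abstract outlines an alternating scheme: filter out unreliable (negative) source samples using local geometry and label consistency, then iterate between assigning pseudo labels to the target domain and updating the model. The sketch below is a minimal, hypothetical NumPy illustration of that alternating pattern only; the function names, the k-NN label-consistency filter, and the nearest-centroid pseudo-labelling rule are assumptions for illustration and are not the letter's actual formulation or constraints.

```python
import numpy as np

def select_source_samples(Xs, ys, k=5):
    # Keep a source sample only if most of its k nearest source
    # neighbours share its label (a simple local-geometry consistency test).
    keep = []
    for i in range(len(Xs)):
        d = np.linalg.norm(Xs - Xs[i], axis=1)
        nn = np.argsort(d)[1:k + 1]          # skip the sample itself
        if np.mean(ys[nn] == ys[i]) >= 0.5:
            keep.append(i)
    return np.asarray(keep)

def alternating_pseudo_label(Xs, ys, Xt, n_iter=10):
    # Alternate between (1) pseudo-labelling each target sample by its nearest
    # class centroid and (2) refreshing the centroids with the pseudo-labelled
    # target data; both steps stand in for the letter's constrained sub-problems.
    idx = select_source_samples(Xs, ys)
    Xs, ys = Xs[idx], ys[idx]
    classes = np.unique(ys)
    centroids = np.stack([Xs[ys == c].mean(axis=0) for c in classes])
    for _ in range(n_iter):
        dists = np.linalg.norm(Xt[:, None, :] - centroids[None, :, :], axis=2)
        yt = classes[dists.argmin(axis=1)]                 # pseudo target labels
        centroids = np.stack([
            np.vstack([Xs[ys == c], Xt[yt == c]]).mean(axis=0)
            for c in classes
        ])
    return yt

# Toy usage with synthetic two-class data.
rng = np.random.default_rng(0)
Xs = np.vstack([rng.normal(0, 1, (50, 3)), rng.normal(3, 1, (50, 3))])
ys = np.array([0] * 50 + [1] * 50)
Xt = np.vstack([rng.normal(0.5, 1, (30, 3)), rng.normal(3.5, 1, (30, 3))])
print(alternating_pseudo_label(Xs, ys, Xt)[:10])
```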
Year
2020
DOI
10.1109/LSP.2020.3025061
Venue
IEEE SIGNAL PROCESSING LETTERS
Keywords
Correlation, Optimization, Knowledge transfer, Task analysis, Learning systems, Signal processing algorithms, Kernel, Transfer learning, inter-domain distribution, negative transfer
DocType
Journal
Volume
27
Issue
99
ISSN
1070-9908
Citations
1
PageRank
0.35
References
0
Authors
3
Name          Order  Citations  PageRank
Zhihao Peng   1      4          1.73
Yuheng Jia    2      93         13.13
Junhui Hou    3      3954       9.84