Title
Domain Invariant Transfer Kernel Learning
Abstract
Domain transfer learning generalizes a learning model across training data and testing data that follow different distributions. A general principle for tackling this problem is to reduce the distribution difference between training and testing data so that the generalization error can be bounded. Current methods typically model the sample distributions in the input feature space, which depends on a nonlinear feature mapping to embody the distribution discrepancy. However, this nonlinear feature space may not be optimal for kernel-based learning machines. To this end, we propose a transfer kernel learning (TKL) approach that learns a domain-invariant kernel by directly matching source and target distributions in the reproducing kernel Hilbert space (RKHS). Specifically, we design a family of spectral kernels by extrapolating the target eigensystem onto source samples via Mercer's theorem. The spectral kernel that minimizes the approximation error to the ground-truth kernel is selected to construct domain-invariant kernel machines. Comprehensive experiments on a large number of text categorization, image classification, and video event recognition datasets verify the effectiveness and efficiency of the proposed TKL approach over several state-of-the-art methods.
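The eigensystem extrapolation described in the abstract is a Nyström-style extension: eigenvectors of the target kernel matrix are extended to source samples through the cross-domain kernel. The sketch below illustrates only that extrapolation step, assuming an RBF base kernel; the function names are illustrative (not the authors' code), and the full TKL method additionally replaces the target eigenvalues with learned coefficients that minimize the approximation error to the ground-truth kernel, which is omitted here.

```python
import numpy as np

def rbf(A, B, gamma=1.0):
    # Pairwise squared Euclidean distances, then the RBF kernel exp(-gamma * d^2).
    d = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2 * A @ B.T
    return np.exp(-gamma * d)

def nystrom_extrapolated_kernel(Xs, Xt, gamma=1.0):
    """Build a spectral kernel over source+target samples by extrapolating
    the target eigensystem onto the source via the Nystrom extension."""
    Kt = rbf(Xt, Xt, gamma)            # target kernel matrix
    lam, V = np.linalg.eigh(Kt)        # target eigensystem Kt = V diag(lam) V^T
    keep = lam > 1e-10                 # drop numerically zero eigenvalues
    lam, V = lam[keep], V[:, keep]
    Kst = rbf(Xs, Xt, gamma)           # cross-domain kernel (source x target)
    Phi_s = Kst @ V / lam              # Nystrom: extrapolated eigenvectors on source
    Phi = np.vstack([Phi_s, V])        # stacked source and target "eigenfeatures"
    return Phi @ np.diag(lam) @ Phi.T  # spectral kernel on all samples
```

By construction the target-target block of the returned matrix reproduces the target kernel exactly, while the source blocks inherit the target eigensystem, which is what makes the resulting kernel domain-invariant in spirit.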
Year
2015
DOI
10.1109/TKDE.2014.2373376
Venue
IEEE Transactions on Knowledge and Data Engineering
Keywords
Nyström method, transfer learning, image classification, kernel learning, text mining, video recognition, approximation error, learning (artificial intelligence), testing, kernel, Hilbert spaces
Field
Pattern recognition, Radial basis function kernel, Computer science, Kernel embedding of distributions, Kernel principal component analysis, Polynomial kernel, Artificial intelligence, String kernel, Kernel method, Variable kernel density estimation, Machine learning, Kernel (statistics)
DocType
Journal
Volume
27
Issue
6
ISSN
1041-4347
Citations
49
PageRank
1.26
References
37
Authors
4
Name             Order  Citations  PageRank
Mingsheng Long   1      1421       55.15
Jianmin Wang     2      2446       156.05
Jia-guang Sun    3      1807       134.30
Philip S. Yu     4      30670      3474.16