Title
Feature Selection via Transferring Knowledge Across Different Classes
Abstract
The problem of feature selection has attracted considerable research interest in recent years. Supervised information can significantly improve the quality of the selected features. However, existing supervised feature selection methods all require the classes in the labeled data (source domain) and the unlabeled data (target domain) to be identical, which may be too restrictive in many cases. In this article, we consider a more challenging cross-class setting in which the classes in the two domains are related but different, a setting that has rarely been studied before. We propose a cross-class knowledge transfer feature selection framework that transfers knowledge from the source domain to guide feature selection in the target domain. Specifically, high-level descriptions, i.e., attributes, are used as the bridge for knowledge transfer. To further improve the quality of the selected features, our framework jointly considers the tasks of cross-class knowledge transfer and feature selection. Experimental results on four benchmark datasets demonstrate the superiority of the proposed method.
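The abstract describes the framework only at a high level. The Python sketch below is a hypothetical illustration of the cross-class setting it outlines, not the authors' algorithm: attributes learned on labeled source classes act as the bridge, and transferred attribute predictions guide feature scoring on target classes whose labels are unseen. All variable names and the ridge-regression/row-norm scoring are illustrative assumptions, and the sketch performs transfer and selection sequentially, whereas the paper's framework treats them jointly.

```python
import numpy as np

# Illustrative sketch (not the authors' method): source and target domains
# share a feature space and an attribute space, but their class sets differ.
rng = np.random.default_rng(0)
d, m = 50, 5                      # feature dimension, number of attributes

# Source domain: labeled samples with attribute annotations.
Xs = rng.normal(size=(200, d))
A_true = rng.normal(size=(d, m))                       # synthetic ground truth
As = Xs @ A_true + 0.1 * rng.normal(size=(200, m))     # source attribute labels

# Target domain: unlabeled samples from different (unseen) classes.
Xt = rng.normal(size=(100, d))

# Step 1: learn a feature-to-attribute map on the source domain (ridge regression).
lam = 1.0
W = np.linalg.solve(Xs.T @ Xs + lam * np.eye(d), Xs.T @ As)   # d x m

# Step 2: transfer -- predict attributes for the target-domain samples.
At_hat = Xt @ W

# Step 3: score target features by how well they reconstruct the transferred
# attributes: fit a map V on the target domain and use its row norms as
# feature-importance scores.
V = np.linalg.solve(Xt.T @ Xt + lam * np.eye(d), Xt.T @ At_hat)  # d x m
scores = np.linalg.norm(V, axis=1)

k = 10
selected = np.argsort(scores)[::-1][:k]   # indices of the top-k features
print("selected feature indices:", selected)
```

In the paper's formulation the transfer and selection objectives are optimized together rather than in separate steps; the sequential version here is only meant to make the role of attributes as the cross-class bridge concrete.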
Year
2019
DOI
10.1145/3314202
Venue
ACM Transactions on Knowledge Discovery from Data (TKDD)
Keywords
Feature selection, dimension reduction, supervision transfer
Field
Data mining, Dimensionality reduction, Feature selection, Computer science, Knowledge transfer, Artificial intelligence, Labeled data, Machine learning
DocType
Journal
Volume
13
Issue
2
ISSN
1556-4681
Citations
2
PageRank
0.37
References
0
Authors
4
Name, Order, Citations, PageRank
Zheng Wang, 1, 724, 7.08
Xiaojun Ye, 2, 1852, 8.48
Chaokun Wang, 3, 4114, 6.63
Philip S. Yu, 4, 306703, 474.16