Abstract |
---|
Transfer learning techniques have seen significant development in real applications, where knowledge from previous tasks is used to reduce the high cost of acquiring labeled information for the target task. However, how to avoid negative transfer, which arises from the differing distributions of tasks in heterogeneous environments, remains an open problem. To handle this issue, we propose a Compact Coding method for Hyperplane Classifiers (CCHC) under a two-level framework in the inductive transfer learning setting. Unlike traditional methods, we measure the similarities among tasks from a macro-level perspective through minimum encoding. Specifically, the degree of similarity is represented by the code length of the class boundary of each source task with respect to the target task. In addition, informative parts of the source tasks are adaptively selected at the micro level to make the choice of the specific source task more accurate. Extensive experiments demonstrate the effectiveness of our algorithm in terms of classification accuracy on both UCI and text data sets. |
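The macro-level idea in the abstract, ranking source tasks by how compactly their class boundaries encode the target data, can be illustrated with a minimal MDL-style sketch. This is only an illustration of code-length-based source selection, not the paper's CCHC algorithm; the function names and the logistic code-length model are assumptions:

```python
import numpy as np

def code_length(w, b, X, y):
    """Approximate description length (in bits) of target labels y
    given a source hyperplane (w, b): the negative log2-likelihood
    under a logistic model. A shorter code suggests the source
    boundary fits the target task better. (Illustrative model only.)"""
    margins = y * (X @ w + b)                    # y in {-1, +1}
    # log(1 + exp(-m)) computed stably, converted from nats to bits
    return np.logaddexp(0.0, -margins).sum() / np.log(2.0)

def most_similar_source(hyperplanes, X_target, y_target):
    """Pick the source task whose hyperplane encodes the target
    labels with the fewest bits (MDL-style selection)."""
    lengths = [code_length(w, b, X_target, y_target)
               for w, b in hyperplanes]
    return int(np.argmin(lengths)), lengths
```

For example, a target set separable along the first feature axis would assign a shorter code length to a source hyperplane aligned with that axis than to an orthogonal one, so `most_similar_source` would select the aligned source.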
Year | DOI | Venue
---|---|---
2011 | 10.1007/978-3-642-23808-6_14 | ECML/PKDD (3)

Keywords | Field | DocType
---|---|---
hyperplane classifier,compact coding method,macro level perspective,inductive transfer,compact coding,specific source task,source task,hyperplane classifiers,micro level viewpoint,previous task,target task,heterogeneous environment,negative transfer | Multi-task learning,Negative transfer,Inductive transfer,Transfer of learning,Coding (social sciences),Artificial intelligence,Hyperplane,Macro,Mathematics,Machine learning,Encoding (memory) | Conference

Volume | ISSN | Citations
---|---|---
6913 | 0302-9743 | 4

PageRank | References | Authors
---|---|---
0.39 | 22 | 3
Name | Order | Citations | PageRank |
---|---|---|---
Hao Shao | 1 | 8 | 1.11 |
Bin Tong | 2 | 40 | 8.11 |
Einoshin Suzuki | 3 | 853 | 93.41 |