Abstract
---
The objective of Multi-Task Learning (MTL) is to boost learning performance by simultaneously learning multiple related tasks. Identifying and modeling the relationships among tasks is essential for multi-task learning. Most previous works assume that related tasks share a common structure. However, this assumption is often too restrictive: in some real-world applications, related tasks share knowledge only partially at the feature level, i.e., the relevant features of related tasks may only partially overlap. In this paper, we propose a new MTL approach that exploits this partial relationship among tasks, selectively sharing information across tasks while producing a task-specific sparse pattern for each task. This increased flexibility makes it possible to model complex structure among tasks. An efficient alternating optimization algorithm is developed to solve the model. Experimental studies on real-world data demonstrate that the proposed method significantly improves learning performance by exploiting the partial relationship across tasks at the feature level.
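The abstract does not spell out the formulation, but one common way to realize partial feature-level sharing with task-specific sparse patterns is a "dirty model" style decomposition, W = P + Q, where P is row-sparse (features shared across tasks) and Q is elementwise sparse (task-specific features), fitted by alternating proximal gradient steps. The sketch below is an illustration under these assumptions, not the authors' actual method; all names and penalty choices are hypothetical.

```python
import numpy as np

def prox_l21(M, t):
    """Row-wise group soft-thresholding: zeroes whole feature rows,
    so surviving features are shared across all tasks."""
    norms = np.maximum(np.linalg.norm(M, axis=1, keepdims=True), 1e-12)
    return M * np.maximum(0.0, 1.0 - t / norms)

def prox_l1(M, t):
    """Elementwise soft-thresholding: yields a task-specific sparse pattern."""
    return np.sign(M) * np.maximum(np.abs(M) - t, 0.0)

def grad(Xs, ys, W):
    """Gradient of the averaged squared loss, one column per task."""
    G = np.zeros_like(W)
    for t, (X, y) in enumerate(zip(Xs, ys)):
        G[:, t] = X.T @ (X @ W[:, t] - y) / X.shape[0]
    return G

def dirty_mtl(Xs, ys, lam_shared=0.05, lam_specific=0.05, n_iter=300):
    """Alternating proximal-gradient sketch of a dirty-model MTL objective
    (assumed form, not the paper's): least squares + L2,1 on P + L1 on Q."""
    d, T = Xs[0].shape[1], len(Xs)
    P = np.zeros((d, T))   # shared, row-sparse component
    Q = np.zeros((d, T))   # task-specific, elementwise-sparse component
    L = max(np.linalg.norm(X, 2) ** 2 / X.shape[0] for X in Xs)
    lr = 1.0 / L           # step size from the loss Lipschitz constant
    for _ in range(n_iter):
        # alternate: update the shared block, then the task-specific block
        P = prox_l21(P - lr * grad(Xs, ys, P + Q), lr * lam_shared)
        Q = prox_l1(Q - lr * grad(Xs, ys, P + Q), lr * lam_specific)
    return P, Q
```

The effective weight for task t is `P[:, t] + Q[:, t]`, so two tasks can overlap on the rows kept in P while still keeping disjoint nonzeros in Q.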
Year | DOI | Venue |
---|---|---|
2017 | 10.1007/978-3-319-70139-4_10 | Neural Information Processing, ICONIP 2017, Part V
Keywords | Field | DocType
---|---|---
Multi-Task Learning, Partial task relationship | Multi-task learning, Computer science, Exploit, Artificial intelligence, Machine learning | Conference
Volume | ISSN | Citations
---|---|---
10638 | 0302-9743 | 0
PageRank | References | Authors
---|---|---
0.34 | 9 | 4
Name | Order | Citations | PageRank |
---|---|---|---
Cheng Liu | 1 | 33 | 5.72 |
Wen-Ming Cao | 2 | 26 | 11.53 |
Chutao Zheng | 3 | 9 | 1.90 |
Hau-San Wong | 4 | 1008 | 86.89 |