Title
Task-group Relatedness and Generalization Bounds for Regularized Multi-task Learning
Abstract
In this paper, we study the generalization performance of regularized multi-task learning (RMTL) in a vector-valued framework, where multi-task learning (MTL) is treated as a learning process for vector-valued functions. We are mainly concerned with two theoretical questions: 1) under what conditions does RMTL outperform single-task learning (STL) while using a smaller sample size per task? 2) under what conditions does RMTL generalize and guarantee the consistency of each task during simultaneous learning? In particular, we investigate two types of task-group relatedness: the observed discrepancy-dependence measure (ODDM) and the empirical discrepancy-dependence measure (EDDM), both of which detect the dependence between two groups of multiple related tasks (MRTs). We then introduce the Cartesian product-based uniform entropy number (CPUEN) to measure the complexity of vector-valued function classes. By applying specific deviation and symmetrization inequalities to the vector-valued framework, we obtain a generalization bound for RMTL, which upper-bounds the joint probability of the event that at least one task exhibits a large discrepancy between its expected and empirical risks. Finally, we present a sufficient condition guaranteeing the consistency of each task in the simultaneous learning process, and we discuss how task relatedness affects the generalization performance of RMTL. These theoretical findings answer the two questions posed above.
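As a schematic illustration, the LaTeX display below sketches the event whose joint probability the stated generalization bound controls. The notation (T tasks, a vector-valued hypothesis f = (f_1, ..., f_T) ranging over a class F, expected risk R_t and empirical risk \hat{R}_t for task t, tolerance \varepsilon) is assumed for exposition only and is not taken from the record itself.

% Schematic only: all symbols are assumed notation, not drawn from the abstract.
% The bound controls the probability that at least one of the T simultaneously
% learned tasks shows a large gap between its expected and empirical risk:
\Pr\Bigg\{ \bigcup_{t=1}^{T} \Big\{ \sup_{f \in \mathcal{F}} \big| R_t(f_t) - \widehat{R}_t(f_t) \big| > \varepsilon \Big\} \Bigg\}
  \;\le\; \text{(a quantity expressed via the CPUEN of } \mathcal{F}\text{)}

Under this reading, consistency of every task corresponds to the left-hand side vanishing as the per-task sample size grows.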
Year
2014
Venue
CoRR
Field
Mathematical optimization, Joint probability distribution, Multi-task learning, Task group, Upper and lower bounds, Cartesian product, Symmetrization, Artificial intelligence, Machine learning, Mathematics, Sample size determination
DocType

Volume
abs/1408.6617
Citations
0
Journal

PageRank
0.34
References
13
Authors
4
Name          Order  Citations  PageRank
Chao Zhang    1      351        63.97
Dacheng Tao   2      19032      747.78
Hu Tao        3      70         9.94
Xiang Li      4      158        47.86