Abstract |
---|
Multitask learning algorithms are typically designed assuming some fixed, a priori known latent structure shared by all the tasks. However, it is usually unclear what type of latent task structure is the most appropriate for a given multitask learning problem. Ideally, the "right" latent task structure should be learned in a data-driven manner. We present a flexible, nonparametric Bayesian model that posits a mixture of factor analyzers structure on the tasks. The nonparametric aspect makes the model expressive enough to subsume many existing models of latent task structures (e.g., mean-regularized tasks, clustered tasks, low-rank or linear/non-linear subspace assumption on tasks, etc.). Moreover, it can also learn more general task structures, addressing the shortcomings of such models. We present a variational inference algorithm for our model. Experimental results on synthetic and real-world datasets, on both regression and classification problems, demonstrate the effectiveness of the proposed method. |
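The mixture-of-factor-analyzers (MFA) structure on tasks can be illustrated with a short generative sketch: each task's weight vector is drawn near the low-dimensional subspace of its assigned cluster. This is a minimal NumPy sketch under assumed names (`sample_task_weights`, `n_clusters`, `latent_dim` are illustrative, not from the paper), included to show how the special cases listed in the abstract fall out of one structure.

```python
import numpy as np

# Hedged sketch of an MFA prior on task parameters; all names are illustrative.
rng = np.random.default_rng(0)

def sample_task_weights(n_tasks, dim, n_clusters, latent_dim, noise_std=0.1):
    """Each task t picks a cluster z_t; its weight vector lies near that
    cluster's low-dimensional subspace: w_t = L_{z_t} s_t + mu_{z_t} + eps."""
    mus = rng.normal(size=(n_clusters, dim))             # cluster means
    Ls = rng.normal(size=(n_clusters, dim, latent_dim))  # factor loadings
    z = rng.integers(n_clusters, size=n_tasks)           # cluster assignments
    s = rng.normal(size=(n_tasks, latent_dim))           # per-task latent factors
    W = np.array([Ls[z[t]] @ s[t] + mus[z[t]] for t in range(n_tasks)])
    W += noise_std * rng.normal(size=W.shape)            # task-specific noise
    return W, z

# Special cases subsumed by this structure (cf. the abstract):
# - n_clusters=1, latent_dim=0: tasks scatter around one mean (mean-regularized)
# - n_clusters=K, latent_dim=0: clustered tasks
# - n_clusters=1, latent_dim=k: tasks share a k-dimensional subspace
W, z = sample_task_weights(n_tasks=20, dim=5, n_clusters=3, latent_dim=2)
print(W.shape)  # (20, 5)
```

In the paper's nonparametric setting, the number of clusters is not fixed in advance but inferred from data; the fixed `n_clusters` here is only for illustration.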
Year | Venue | Field |
---|---|---|
2012 | ICML | Multi-task learning, Pattern recognition, Subspace topology, Inference, Computer science, A priori and a posteriori, Nonparametric statistics, Bayesian network, Artificial intelligence, Probabilistic latent semantic analysis, Cluster analysis, Machine learning |
DocType | Citations | PageRank |
---|---|---|
Conference | 20 | 0.86 |
References | Authors |
---|---|
24 | 4 |
Name | Order | Citations | PageRank |
---|---|---|---|
Passos, Alexandre | 1 | 4083 | 167.18 |
Piyush Rai | 2 | 604 | 36.79 |
Jacques Wainer | 3 | 912 | 89.74 |
Hal Daumé, III | 4 | 3673 | 200.05 |