Title
Multi-task Gaussian Process Prediction
Abstract
In this paper we investigate multi-task learning in the context of Gaussian Processes (GP). We propose a model that learns a shared covariance function on input-dependent features and a "free-form" covariance matrix over tasks. This allows for good flexibility when modelling inter-task dependencies while avoiding the need for large amounts of data for training. We show that under the assumption of noise-free observations and a block design, predictions for a given task only depend on its target values and therefore a cancellation of inter-task transfer occurs. We evaluate the benefits of our model on two practical applications: a compiler performance prediction problem and an exam score prediction task. Additionally, we make use of GP approximations and properties of our model in order to provide scalability to large data sets.
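The model described in the abstract pairs a shared covariance function over inputs with a "free-form" inter-task covariance matrix, i.e. a covariance of the form Cov(f_i(x), f_j(x')) = K^f_{ij} k^x(x, x'). The following is a minimal sketch of how such a Kronecker-structured covariance can be used for multi-task GP mean prediction under a block design. The squared-exponential kernel, the function names (k_x, multitask_cov, predict), and the added noise term are illustrative assumptions and are not taken from the paper, which also covers hyperparameter learning and approximations not shown here.

```python
# Illustrative sketch of a multi-task GP covariance: K_f (inter-task) kron k_x (inputs).
import numpy as np

def k_x(X1, X2, lengthscale=1.0):
    """Squared-exponential covariance over inputs (one plausible choice of k^x)."""
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / lengthscale**2)

def multitask_cov(X1, X2, K_f, lengthscale=1.0):
    """Covariance between all (task, input) pairs: kron(K_f, k_x(X1, X2))."""
    return np.kron(K_f, k_x(X1, X2, lengthscale))

def predict(X_train, Y, X_test, K_f, noise=1e-2, lengthscale=1.0):
    """GP posterior mean for all tasks at X_test.

    Y is an (n_train, m) block design: every task observed at every input.
    Targets are stacked task-by-task to match the Kronecker ordering.
    """
    m = K_f.shape[0]
    y = Y.T.reshape(-1)                                    # stack targets per task
    K = multitask_cov(X_train, X_train, K_f, lengthscale)
    K += noise * np.eye(K.shape[0])                        # observation noise (assumed)
    K_s = np.kron(K_f, k_x(X_test, X_train, lengthscale))  # test/train cross-covariance
    alpha = np.linalg.solve(K, y)
    return (K_s @ alpha).reshape(m, -1).T                  # (n_test, m) predictive means

# Toy usage: two correlated tasks observed on the same set of inputs.
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(20, 1))
K_f = np.array([[1.0, 0.8], [0.8, 1.0]])                  # hypothetical task covariance
F = np.hstack([np.sin(X), np.sin(X) + 0.3 * np.cos(X)])
Y = F + 0.05 * rng.standard_normal(F.shape)
X_star = np.linspace(-3, 3, 50)[:, None]
mu = predict(X, Y, X_star, K_f)
print(mu.shape)  # (50, 2)
```

Setting the noise to zero in this sketch illustrates the abstract's observation: with noise-free observations and a block design, the prediction for one task reduces to a function of that task's own targets, so no inter-task transfer takes place.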
Year
2007
Venue
NIPS
Keywords
gaussian process, block design, covariance matrix, multi task learning, covariance function
Field
Covariance function, Data set, Computer science, Block design, Compiler, Artificial intelligence, Gaussian process, Covariance matrix, Performance prediction, Machine learning, Scalability
DocType
Conference
Citations
246
PageRank
11.37
References
13
Authors
3
Name                         Order  Citations  PageRank
Edwin V. Bonilla             1      1008       53.32
Kian Ming Adam Chai          2      417        22.18
Christopher K. I. Williams   3      6807       631.16