Title
Heteroscedastic Gaussian Process Regression Using Expectation Propagation
Abstract
Gaussian Processes (GPs) are Bayesian non-parametric models that achieve state-of-the-art performance in regression tasks. To allow for analytical tractability, the noise power is usually assumed constant in these models, which is unrealistic for many real-world problems. In this work we consider a GP model with heteroscedastic (i.e., input-dependent) noise power, and then use Expectation Propagation (EP) to perform approximate inference on it. The proposed EP approach is much faster than Markov Chain Monte Carlo and more accurate than competing methods of similar computational cost. This superiority is illustrated in several experiments with synthetic and real-world data.
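The abstract describes a GP regression model whose observation-noise power varies with the input. As a rough illustration of that model structure only (not the paper's EP inference scheme), the following Python sketch computes the exact GP posterior when the per-input noise variance is assumed known; the kernel choice, the noise profile, and all function names are hypothetical choices made for this example.

```python
# Minimal sketch of heteroscedastic GP regression with a *known* noise function.
# In the paper the noise function is itself latent and inferred with EP;
# here it is fixed purely to show how input-dependent noise enters the model.
import numpy as np

def rbf_kernel(X1, X2, lengthscale=1.0, variance=1.0):
    """Squared-exponential covariance between two sets of 1-D inputs."""
    d2 = (X1[:, None] - X2[None, :]) ** 2
    return variance * np.exp(-0.5 * d2 / lengthscale**2)

def noise_variance(X):
    """Hypothetical input-dependent (heteroscedastic) noise power."""
    return 0.05 + 0.3 * np.abs(np.sin(X))

def gp_predict_hetero(X_train, y_train, X_test):
    """Exact GP posterior mean/variance given the per-point noise variances."""
    # Heteroscedastic noise appears as a non-constant diagonal term.
    K = rbf_kernel(X_train, X_train) + np.diag(noise_variance(X_train))
    K_s = rbf_kernel(X_train, X_test)
    K_ss = rbf_kernel(X_test, X_test)
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y_train))
    mean = K_s.T @ alpha
    v = np.linalg.solve(L, K_s)
    var = np.diag(K_ss) - np.sum(v**2, axis=0) + noise_variance(X_test)
    return mean, var

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = np.linspace(0.0, 5.0, 60)
    y = np.sin(X) + rng.normal(0.0, np.sqrt(noise_variance(X)))
    X_test = np.linspace(0.0, 5.0, 200)
    mu, var = gp_predict_hetero(X, y, X_test)
    print(mu[:3], var[:3])
```

The predictive variance correctly widens in regions where the assumed noise power is large, which is the behaviour the paper targets; the paper's contribution is inferring that noise function from data with EP rather than assuming it known.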
Year
2011
DOI
10.1109/MLSP.2011.6064576
Venue
2011 IEEE International Workshop on Machine Learning for Signal Processing (MLSP)
Keywords
Markov chain Monte Carlo, Gaussian process, Gaussian process regression, parametric model
Field
Kriging, Noise power, Gaussian random field, Markov chain Monte Carlo, Pattern recognition, Computer science, Approximate inference, Artificial intelligence, Gaussian process, Expectation propagation, Bayesian probability
DocType
Conference
ISSN
2161-0363
Citations
5
PageRank
0.48
References
6
Authors
3
Name                        Order  Citations  PageRank
Luis Muñoz-González         1      8          28.48
Miguel Lázaro-Gredilla      2      383        26.46
Aníbal R. Figueiras-Vidal   3      467        38.03