Title
MCMC for Variationally Sparse Gaussian Processes
Abstract
Gaussian process (GP) models form a core part of probabilistic machine learning. Considerable research effort has gone into attacking three issues with GP models: how to compute efficiently when the number of data points is large; how to approximate the posterior when the likelihood is not Gaussian; and how to estimate posteriors over covariance function parameters. This paper addresses all three simultaneously, using a variational approximation to the posterior that is sparse in the support of the function but otherwise free-form. The result is a Hybrid Monte Carlo sampling scheme that allows a non-Gaussian approximation over the function values and covariance parameters simultaneously, with efficient computations based on inducing-point sparse GPs. Code to replicate each experiment in this paper is available at github.com/sparseMCMC.
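The abstract describes sampling the inducing values of a sparse GP with Hybrid Monte Carlo instead of fixing a Gaussian approximation over them. A minimal NumPy sketch of that idea follows; it is not the authors' code. The RBF kernel, toy data, step sizes, and the whitened parameterisation u = Lv (prior on v becomes N(0, I)) are illustrative assumptions for a Gaussian-likelihood special case.

```python
import numpy as np

rng = np.random.default_rng(0)

def rbf(A, B, ell=0.2):
    # squared-exponential kernel between row-wise input sets A and B
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / ell ** 2)

# toy 1-D regression data and m = 7 inducing inputs Z (all assumptions)
X = np.linspace(0, 1, 40)[:, None]
y = np.sin(6 * X[:, 0]) + 0.1 * rng.standard_normal(40)
Z = np.linspace(0, 1, 7)[:, None]
sigma2 = 0.1  # assumed Gaussian noise variance

Kmm = rbf(Z, Z) + 1e-6 * np.eye(len(Z))
L = np.linalg.cholesky(Kmm)
# whitened parameterisation u = L v; the predicted mean at X is B @ v
B = rbf(X, Z) @ np.linalg.inv(L).T

def log_post(v):
    # log N(y | Bv, sigma2 I) + log N(v | 0, I), up to constants
    r = y - B @ v
    return -0.5 * r @ r / sigma2 - 0.5 * v @ v

def grad(v):
    return B.T @ (y - B @ v) / sigma2 - v

def hmc_step(v, eps=0.02, n_leap=15):
    # one HMC transition: leapfrog integration + Metropolis accept/reject
    p0 = rng.standard_normal(v.shape)
    vn, p = v.copy(), p0 + 0.5 * eps * grad(v)
    for i in range(n_leap):
        vn = vn + eps * p
        p = p + (0.5 if i == n_leap - 1 else 1.0) * eps * grad(vn)
    dH = (log_post(vn) - 0.5 * p @ p) - (log_post(v) - 0.5 * p0 @ p0)
    return (vn, 1) if np.log(rng.uniform()) < dH else (v, 0)

v, acc = np.zeros(len(Z)), 0
for _ in range(500):
    v, a = hmc_step(v)
    acc += a

rate = acc / 500
mse = float(np.mean((y - B @ v) ** 2))
print(rate, mse)
```

For a non-Gaussian likelihood, as in the paper, `log_post` and `grad` would use the actual log-likelihood of `B @ v` in place of the Gaussian residual term; the sampler itself is unchanged.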
Year
2015
Venue
Annual Conference on Neural Information Processing Systems
Field
Covariance function, Markov chain Monte Carlo, Computer science, Gaussian, Artificial intelligence, Gaussian process, Probabilistic logic, Replicate, Machine learning, Covariance, Computation
DocType
Conference
Volume
28
ISSN
1049-5258
Citations
12
PageRank
0.69
References
22
Authors
4
Name                          Order  Citations  PageRank
James Hensman                 1      265        20.05
Alexander G. de G. Matthews   2      49         2.72
Maurizio Filippone            3      618        39.58
Zoubin Ghahramani             4      104551     264.39