Title
A Unifying Framework for Sparse Gaussian Process Approximation using Power Expectation Propagation.
Abstract
Gaussian processes (GPs) are flexible distributions over functions that enable high-level assumptions about unknown functions to be encoded in a parsimonious, flexible and general way. Although elegant, the application of GPs is limited by computational and analytical intractabilities that arise when data are sufficiently numerous or when employing non-Gaussian models. Consequently, a wealth of GP approximation schemes have been developed over the last 15 years to address these key limitations. Many of these schemes employ a small set of pseudo data points to summarise the actual data. In this paper we develop a new pseudo-point approximation framework using Power Expectation Propagation (Power EP) that unifies a large number of these pseudo-point approximations. Unlike much of the previous venerable work in this area, the new framework is built on standard methods for approximate inference (variational free-energy, EP and Power EP methods) rather than employing approximations to the probabilistic generative model itself. In this way all of the approximation is performed at 'inference time' rather than at 'modelling time', resolving awkward philosophical and empirical questions that trouble previous approaches. Crucially, we demonstrate that the new framework includes new pseudo-point approximation methods that outperform current approaches on regression, classification and state space modelling tasks.
Year
2016
Venue
arXiv: Machine Learning
Field
Data point, Mathematical optimization, Regression, Approximate inference, Artificial intelligence, Gaussian process, Global Positioning System, Expectation propagation, Small set, State space, Mathematics, Machine learning
DocType
Journal
Volume
abs/1605.07066
Citations
3
PageRank
0.40
References
19
Authors
3
Name | Order | Citations | PageRank
Bui, Thang D. | 1 | 57 | 5.77
Josiah Yan | 2 | 3 | 0.40
Richard E. Turner | 3 | 322 | 37.95