Title
Scalable Gaussian Process Classification via Expectation Propagation
Abstract
Variational methods have recently been considered for scaling the training of Gaussian process classifiers to large datasets. As an alternative, we describe here how to train these classifiers efficiently using expectation propagation (EP). The proposed EP method makes it possible to train Gaussian process classifiers on very large datasets, with millions of instances, which were out of the reach of previous implementations of EP. More precisely, it can be used for (i) training in a distributed fashion, where the data instances are sent to different nodes in which the required computations are carried out, and for (ii) maximizing an estimate of the marginal likelihood using a stochastic approximation of its gradient. Several experiments involving large datasets show that the described method is competitive with the variational approach.
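Point (ii) above can be illustrated with a minimal, generic sketch of stochastic gradient optimization of a marginal-likelihood estimate: hyperparameters are updated using noisy gradients computed on random minibatches. For simplicity the sketch uses a GP regression marginal likelihood with finite-difference gradients as a stand-in for the paper's EP estimate; all names, the toy data, and the step sizes are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def rbf_kernel(X1, X2, log_ell, log_sf2):
    # Squared-exponential kernel parameterized by log-lengthscale and log-amplitude.
    d = np.sum(X1**2, 1)[:, None] + np.sum(X2**2, 1)[None, :] - 2 * X1 @ X2.T
    return np.exp(log_sf2) * np.exp(-0.5 * d / np.exp(2 * log_ell))

def minibatch_nlml_grad(X, y, theta, noise=0.1, eps=1e-4):
    # Noisy negative log marginal likelihood gradient on one minibatch,
    # via central finite differences (stand-in for an analytic EP gradient).
    def nlml(th):
        K = rbf_kernel(X, X, th[0], th[1]) + noise * np.eye(len(X))
        L = np.linalg.cholesky(K)
        a = np.linalg.solve(L.T, np.linalg.solve(L, y))
        return 0.5 * y @ a + np.sum(np.log(np.diag(L)))
    g = np.zeros_like(theta)
    for i in range(len(theta)):
        e = np.zeros_like(theta)
        e[i] = eps
        g[i] = (nlml(theta + e) - nlml(theta - e)) / (2 * eps)
    return g

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(500, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(500)

theta = np.array([1.0, 0.0])  # [log lengthscale, log amplitude^2]
lr = 0.05
for step in range(200):
    idx = rng.choice(len(X), size=50, replace=False)  # random minibatch
    theta -= lr * minibatch_nlml_grad(X[idx], y[idx], theta)
```

Each update touches only a minibatch, so the per-step cost is independent of the full dataset size; this is the property that lets such schemes scale to millions of instances.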
Year
2016
Venue
JMLR Workshop and Conference Proceedings
Field
Computer science, Marginal likelihood, Artificial intelligence, Gaussian process, Expectation propagation, Stochastic approximation, Scaling, Machine learning, Scalability, Computation
DocType
Conference
Volume
51
ISSN
1938-7288
Citations
0
PageRank
0.34
References
7
Authors
2
Name
Daniel Hernández-Lobato
Order
1
Citations
440
PageRank
26.10

Name
José Miguel Hernández-Lobato
Order
2
Citations
613
PageRank
49.06