Title
Tilted Variational Bayes
Abstract
We present a novel method for approximate inference. Using some of the constructs from expectation propagation (EP), we derive a lower bound of the marginal likelihood in a similar fashion to variational Bayes (VB). The method combines some of the benefits of VB and EP: it can be used with light-tailed likelihoods (where traditional VB fails), and it provides a lower bound on the marginal likelihood. We apply the method to Gaussian process classification, a situation where the Kullback-Leibler divergence minimized in traditional VB can be infinite, and to robust Gaussian process regression, where the inference process is dramatically simplified in comparison to EP. Code to reproduce all the experiments can be found at github.com/SheffieldML/TVB.
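A minimal sketch of the kind of bound the abstract describes, in generic VB/EP notation (the symbols f, y, q and the step-likelihood example are illustrative assumptions, not the paper's own derivation): for latent function values f with prior p(f) and likelihood p(y | f), Jensen's inequality gives the standard VB lower bound on the marginal likelihood, and the failure mode for light-tailed likelihoods is visible directly.

% Sketch only: the generic VB bound that methods of this kind tighten;
% notation is assumed here, not taken from the paper.
\begin{align}
\log p(\mathbf{y})
  &= \log \int p(\mathbf{y} \mid \mathbf{f})\, p(\mathbf{f})\, \mathrm{d}\mathbf{f} \\
  &\geq \mathbb{E}_{q(\mathbf{f})}\!\big[ \log p(\mathbf{y} \mid \mathbf{f}) \big]
      - \mathrm{KL}\big( q(\mathbf{f}) \,\big\|\, p(\mathbf{f}) \big).
\end{align}
% With a step-function classification likelihood, \log p(y | f) = -\infty on a set of
% positive measure under any Gaussian q(f), so the bound is -\infty and the KL to the
% (truncated) posterior is infinite: the VB failure mode the abstract mentions.
% EP's "tilted" distribution for site n is proportional to
% q^{\setminus n}(\mathbf{f})\, t_n(f_n), i.e. the cavity distribution times one exact
% likelihood factor; these are the EP constructs the abstract says the method reuses.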
Year
2014
Venue
JMLR Workshop and Conference Proceedings
Field
Kriging, Inference, Upper and lower bounds, Marginal likelihood, Approximate inference, Artificial intelligence, Gaussian process, Expectation propagation, Machine learning, Mathematics, Bayes' theorem
DocType
Conference
Volume
33
ISSN
1938-7288
Citations
1
PageRank
0.35
References
13
Authors
3
Name              Order  Citations  PageRank
James Hensman     1      265        20.05
Max Zwiessele     2      2          1.05
Neil D. Lawrence  3      3411       268.51