Title
Approximate Marginals in Latent Gaussian Models
Abstract
We consider the problem of improving the Gaussian approximate posterior marginals computed by expectation propagation and the Laplace method in latent Gaussian models, and propose methods that are similar in spirit to the Laplace approximation of Tierney and Kadane (1986). We show that in the case of sparse Gaussian models, the computational complexity of expectation propagation can be made comparable to that of the Laplace method by using a parallel updating scheme. In some cases, expectation propagation gives excellent estimates where the Laplace approximation fails. Inspired by bounds on the correct marginals, we arrive at factorized approximations, which can be applied on top of both expectation propagation and the Laplace method. The factorized approximations can give results nearly indistinguishable from those of the non-factorized approximations, and their computational complexity scales linearly with the number of variables. In our experiments, the expectation propagation based marginal approximations we introduce are typically more accurate than the methods of similar complexity proposed by Rue et al. (2009).
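As an illustration of the Laplace method referenced in the abstract, the sketch below (not taken from the paper; the toy one-dimensional Poisson model and all names are assumptions) approximates a posterior marginal by a Gaussian centred at the posterior mode, with variance given by the inverse negative Hessian of the log posterior, found by Newton iterations.

import numpy as np

def laplace_approx(y, prior_mean=0.0, prior_var=1.0, n_iter=50):
    # Gaussian (Laplace) approximation to p(f | y) for y ~ Poisson(exp(f)),
    # with prior f ~ N(prior_mean, prior_var). Hypothetical toy model.
    f = prior_mean
    for _ in range(n_iter):
        # Gradient and Hessian of log p(f | y) = y*f - exp(f) - (f - m)^2 / (2v) + const
        grad = y - np.exp(f) - (f - prior_mean) / prior_var
        hess = -np.exp(f) - 1.0 / prior_var
        f -= grad / hess              # Newton step towards the posterior mode
    return f, -1.0 / hess             # approximate mean and variance

mean, var = laplace_approx(y=3)
print(f"Laplace marginal approximation: N({mean:.3f}, {var:.3f})")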
Year
2011
DOI
10.5555/1953048.1953061
Venue
Journal of Machine Learning Research
Keywords
similar complexity, latent gaussian model, latent gaussian models, computational complexity scales linearly, gaussian approximate posterior, laplace method, sparse gaussian model, laplace approximation, approximate marginals, computational complexity, factorized approximation, expectation propagation
Field
Laplace's method, Gaussian, Artificial intelligence, Expectation propagation, Machine learning, Mathematics, Computational complexity theory
DocType
Journal
Volume
12
ISSN
1532-4435
Citations
11
PageRank
0.96
References
13
Authors
2
Name            Order    Citations    PageRank
Botond Cseke    1        193          11.55
Tom Heskes      2        1519         198.44