Title
Approximate posterior inference for Bayesian models: black-box expectation propagation
Abstract
Expectation propagation (EP) is a widely successful method for approximating the posteriors of complex Bayesian models. However, it incurs expensive memory and time overheads, since it maintains local approximations with locally specific messages. A recent method, averaged EP (AEP), improves on EP by leveraging the average effect of the messages on the posterior distribution, instead of the locally specific ones, thereby reducing both memory and time costs. In this paper, we extend AEP to a novel black-box expectation propagation (BBEP) algorithm, which can be applied directly to many Bayesian models without model-specific derivations. We leverage three ideas from black-box learning, leading to three versions of BBEP, referred to as BBEP$^{m}$, BBEP$^{g}$ and BBEP$^{o}$, based on Monte Carlo moment matching, Monte Carlo gradients and the objective of AEP, respectively. For variance reduction, importance sampling is used, and we discuss the choice of proposal distribution as well as the high-dimensional setting. Furthermore, we develop online versions of BBEP that speed up optimization on large-scale data sets. We empirically compare BBEP against state-of-the-art black-box baseline algorithms on both synthetic and real-world data sets. Experimental results demonstrate that BBEP outperforms the baselines and is even on a par with analytical solutions in some settings.
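The abstract names two generic ingredients behind the BBEP$^{m}$ variant: Monte Carlo moment matching and importance sampling for variance reduction. The sketch below is a minimal one-dimensional illustration of those two ingredients only, not the authors' algorithm; the function name mc_moment_match, the Gaussian cavity proposal, and all parameter values are assumptions made for the example. It estimates the mean and variance of a tilted distribution (cavity times likelihood factor) by self-normalized importance sampling, using the cavity itself as the proposal so that the weights reduce to the likelihood factor.

```python
import numpy as np

def mc_moment_match(cavity_mean, cavity_var, log_lik, n_samples=10_000, seed=0):
    """Estimate the mean and variance of the tilted distribution
    p(x) proportional to N(x; cavity_mean, cavity_var) * exp(log_lik(x))
    via self-normalized importance sampling with the cavity as proposal."""
    rng = np.random.default_rng(seed)
    x = rng.normal(cavity_mean, np.sqrt(cavity_var), size=n_samples)
    # Weight = target / proposal; the cavity density cancels, leaving the
    # likelihood factor. Work in log space and shift for numerical stability.
    logw = log_lik(x)
    w = np.exp(logw - logw.max())
    w /= w.sum()                       # self-normalize the weights
    mean = np.sum(w * x)               # weighted first moment
    var = np.sum(w * (x - mean) ** 2)  # weighted second central moment
    return mean, var

# Usage: a Gaussian likelihood factor, so the matched moments can be
# checked against the closed-form product of two Gaussians.
obs, obs_var = 1.0, 0.5
mean, var = mc_moment_match(0.0, 1.0, lambda x: -0.5 * (x - obs) ** 2 / obs_var)
print(mean, var)  # close to 2/3 and 1/3, the analytical values
```

With a Gaussian cavity as proposal, the weights are well behaved whenever the likelihood factor is not much more concentrated than the cavity; the paper's discussion of proposal selection and the high-dimensional setting addresses exactly the regimes where this simple choice degrades.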
Year
2022
DOI
10.1007/s10115-022-01705-5
Venue
Knowledge and Information Systems
Keywords
Black-box inference, Expectation propagation, Variance reduction, Importance sampling
DocType
Journal
Volume
64
Issue
9
ISSN
0219-1377
Citations
0
PageRank
0.34
References
6
Authors
4
Name            Order   Citations   PageRank
Ximing Li       1       44          13.97
Li Changchun    2       0           0.34
Chi Jinjin      3       0           0.34
Jihong OuYang   4       94          15.66