Abstract |
---|
In topic modelling, the main computational problem is to approximate the posterior distribution given an observed collection. In practice we must resort to variational methods for this approximation, but it is unclear which variational variant is the best choice in a given setting. In this paper, we focus on four topic-model inference methods, namely mean-field variational Bayes, collapsed variational Bayes, hybrid variational-Gibbs, and expectation propagation, and aim to systematically compare them. We analyse them from two perspectives, i.e. the form of the approximate posterior distribution and the type of alpha-divergence being minimised, and then empirically compare them on various datasets using two popular metrics. The empirical results largely match our analysis and indicate that CVB0, the zero-order approximation of collapsed variational Bayes, may be the best variational variant for topic models. |
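
The record itself carries no code, but the abstract's conclusion is easier to place with the standard CVB0 update for LDA in view (the zero-order collapsed variational Bayes approximation of Asuncion et al., 2009): each token's topic responsibility is set proportional to a product of smoothed expected counts, with the token's own contribution removed. Below is a minimal NumPy sketch of one CVB0 sweep; the function name, hyperparameter defaults, and data layout are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def cvb0_pass(tokens, gamma, n_topics, vocab_size, alpha=0.1, beta=0.01):
    """One CVB0 sweep for LDA (illustrative sketch, not the paper's code).

    tokens : list of (doc_id, word_id) pairs, one entry per token occurrence
    gamma  : (len(tokens), n_topics) array of per-token topic
             responsibilities, each row summing to 1
    """
    n_docs = max(d for d, _ in tokens) + 1
    # Expected counts implied by the current responsibilities
    n_dk = np.zeros((n_docs, n_topics))      # topic counts per document
    n_wk = np.zeros((vocab_size, n_topics))  # word counts per topic
    n_k = np.zeros(n_topics)                 # total counts per topic
    for i, (d, w) in enumerate(tokens):
        n_dk[d] += gamma[i]
        n_wk[w] += gamma[i]
        n_k += gamma[i]

    for i, (d, w) in enumerate(tokens):
        g = gamma[i].copy()
        # Remove this token's own contribution ("leave-one-out" counts)
        n_dk[d] -= g
        n_wk[w] -= g
        n_k -= g
        # Zero-order update: responsibility proportional to smoothed counts
        new_g = (n_dk[d] + alpha) * (n_wk[w] + beta) / (n_k + vocab_size * beta)
        new_g /= new_g.sum()
        gamma[i] = new_g
        # Fold the refreshed contribution back into the count tables
        n_dk[d] += new_g
        n_wk[w] += new_g
        n_k += new_g
    return gamma
```

Unlike full CVB, which also propagates second-order (variance) terms, CVB0 keeps only the zero-order mean term, which is what makes it cheap while remaining competitive in the paper's comparison.
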
Year | DOI | Venue
---|---|---
2018 | 10.1080/0952813X.2017.1409277 | Journal of Experimental & Theoretical Artificial Intelligence

Keywords | Field | DocType
---|---|---
Topic modelling, variational methods, variational distributions, alpha-divergence | Computational problem, Inference, Computer science, Posterior probability, Artificial intelligence, Topic model, Expectation propagation, Empirical research, Machine learning, Bayesian probability | Journal

Volume | Issue | ISSN
---|---|---
30 | 1 | 0952-813X

Citations | PageRank | References
---|---|---
0 | 0.34 | 15

Authors |
---|
4 |

Name | Order | Citations | PageRank
---|---|---|---
Jinjin Chi | 1 | 15 | 3.41 |
Jihong OuYang | 2 | 94 | 15.66 |
Ximing Li | 3 | 44 | 13.97 |
Changchun Li | 4 | 11 | 1.89 |