Abstract
---
Recent advances in the scalability and flexibility of variational inference have made it successful at unravelling hidden patterns in complex data. In this work we propose a new variational bound formulation, yielding an estimator that extends beyond the conventional variational bound. It naturally subsumes the importance-weighted and Rényi bounds as special cases, and it is provably sharper than these counterparts. We also present an improved estimator for variational learning, and advocate a novel high signal-to-variance ratio update rule for the variational parameters. We discuss model-selection issues associated with existing evidence-lower-bound-based variational inference procedures, and show how to leverage the flexibility of our new formulation to address them. Empirical evidence is provided to validate our claims.
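The nesting of bounds referenced in the abstract (ELBO ≤ importance-weighted bound ≤ log evidence) can be checked numerically. The sketch below uses a toy conjugate Gaussian model with the prior as proposal; the model, sample sizes, and helper names are illustrative assumptions, not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy model (an illustrative assumption, not the paper's setup):
#   z ~ N(0, 1),  x | z ~ N(z, 1)  =>  marginal  x ~ N(0, 2)
x = 1.5
log_px = -0.5 * np.log(2 * np.pi * 2.0) - x**2 / (2 * 2.0)  # exact log evidence

def log_norm(v, mu, var):
    """Log density of N(mu, var) evaluated at v."""
    return -0.5 * np.log(2 * np.pi * var) - (v - mu) ** 2 / (2 * var)

def iw_bound(K, n_outer=20000):
    """Monte Carlo estimate of the K-sample importance-weighted bound
    with the prior as proposal, q(z) = p(z); K = 1 recovers the ELBO."""
    z = rng.standard_normal((n_outer, K))   # z ~ q = N(0, 1)
    logw = log_norm(x, z, 1.0)              # log w = log p(x | z)
    # log (1/K) sum_k w_k, computed stably per outer sample
    m = logw.max(axis=1, keepdims=True)
    log_avg = m.squeeze(1) + np.log(np.exp(logw - m).mean(axis=1))
    return log_avg.mean()

elbo, iw50 = iw_bound(1), iw_bound(50)
# The bounds are nested: ELBO <= IW_50 <= log p(x)
print(elbo, iw50, log_px)
```

With more importance samples the bound tightens toward the true log evidence, which is the special-case behaviour the abstract says the proposed formulation generalizes.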
Year | Venue | Field
---|---|---
2018 | ICML | Computer science, Inference, Model selection, Artificial intelligence, Machine learning

DocType | Citations | PageRank
---|---|---
Conference | 2 | 0.37
References | Authors
---|---
0 | 5
Name | Order | Citations | PageRank
---|---|---|---
Liqun Chen | 1 | 28 | 4.77 |
Chenyang Tao | 2 | 8 | 7.93 |
Ruiyi Zhang | 3 | 21 | 10.04 |
Ricardo Henao | 4 | 286 | 23.85 |
L. Carin | 5 | 4603 | 339.36 |