Abstract

In classic papers, Zellner demonstrated that Bayesian inference could be derived as the solution to an information-theoretic functional. Below we derive a generalized form of this functional as a variational lower bound of a predictive information bottleneck objective. This generalized functional encompasses most modern inference procedures and suggests novel ones.
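The objective named in the abstract can be sketched as follows (a hedged reconstruction; the symbols \(\theta\), \(x_P\), \(x_F\), and \(\beta\) are illustrative and may differ from the paper's notation). A predictive information bottleneck seeks a representation \(\theta\) of past data \(x_P\) that is maximally informative about future data \(x_F\) while compressing the past:

```latex
% Sketch of a predictive information bottleneck objective (illustrative notation):
% keep information about the future, penalize information retained about the past.
\max_{q(\theta \mid x_P)} \; I(\theta ; x_F) \;-\; \beta \, I(\theta ; x_P)
```

Bounding the mutual-information terms variationally yields an optimizable functional; per the abstract, Zellner's classic result, in which the optimum is the Bayesian posterior, arises as a special case.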
| Year | Venue | DocType |
|---|---|---|
| 2019 | AABI | Conference |

| Citations | PageRank | References |
|---|---|---|
| 0 | 0.34 | 0 |
Authors (1)

| Name | Order | Citations | PageRank |
|---|---|---|---|
| Alexander A. Alemi | 1 | 70 | 9.92 |