Title: Fixing a Broken ELBO
Abstract
Recent work in unsupervised representation learning has focused on learning deep directed latent-variable models. Fitting these models by maximizing the marginal likelihood or evidence is typically intractable; thus, a common approximation is to maximize the evidence lower bound (ELBO) instead. However, maximum likelihood training (whether exact or approximate) does not necessarily result in a good latent representation, as we demonstrate both theoretically and empirically. In particular, we derive variational lower and upper bounds on the mutual information between the input and the latent variable, and use these bounds to derive a rate-distortion curve that characterizes the tradeoff between compression and reconstruction accuracy. Using this framework, we demonstrate that there is a family of models with identical ELBO, but different quantitative and qualitative characteristics. Our framework also suggests a simple new method to ensure that latent variable models with powerful stochastic decoders do not ignore their latent code.
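The rate-distortion decomposition the abstract refers to can be summarized compactly. The following is a minimal sketch, assuming the paper's usual notation (encoder e(z|x), decoder d(x|z), variational marginal m(z), and data entropy H; these symbols are assumptions, not quoted from this record):

\begin{aligned}
R &= \mathbb{E}_{p(x)}\,\mathbb{E}_{e(z\mid x)}\!\left[\log \frac{e(z\mid x)}{m(z)}\right] && \text{(rate)} \\
D &= -\,\mathbb{E}_{p(x)}\,\mathbb{E}_{e(z\mid x)}\!\left[\log d(x\mid z)\right] && \text{(distortion)} \\
H - D &\le I(X;Z) \le R, \qquad \text{ELBO} = -(D + R).
\end{aligned}

Since the ELBO depends only on the sum D + R, models anywhere on a line of constant D + R score identically while trading compression (R) against reconstruction accuracy (D); reweighting or constraining the rate term is one way to select a specific point on that line and keep a powerful decoder from ignoring its latent code.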
Year: 2018
Venue: ICML
Field: Mathematical optimization, Upper and lower bounds, Marginal likelihood, Maximum likelihood, Latent variable, Mutual information, Feature learning, Mathematics
DocType: Conference
Citations: 20
PageRank: 0.69
References: 3
Authors: 6
Name                Order  Citations  PageRank
Alexander A. Alemi      1         70      9.92
Ben Poole               2        554     52.06
Ian Fischer             3        422     26.82
Joshua V. Dillon        4         50      3.85
Rif Saurous             5        148     10.49
Kevin Murphy            6       7589    529.66