Title: Hierarchical Importance Weighted Autoencoders
Abstract: Importance weighted variational inference (Burda et al., 2015) uses multiple i.i.d. samples to obtain a tighter variational lower bound. We believe a joint proposal has the potential to reduce the number of redundant samples, and we introduce a hierarchical structure to induce correlation among them. The hope is that the proposals coordinate to compensate for one another's errors, reducing the variance of the importance estimator. Theoretically, we analyze the conditions under which convergence of the estimator variance can be connected to convergence of the lower bound. Empirically, we confirm that maximizing the lower bound does implicitly minimize variance. Further analysis shows that this is a result of negative correlation induced by the proposed hierarchical meta-sampling scheme, and that inference performance also improves as the number of samples increases.
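The multi-sample lower bound the abstract builds on (Burda et al., 2015) can be sketched numerically. This is a minimal illustration with our own function name, not the paper's code: given K log importance weights, the bound is the log of their averaged (unlogged) weights, computed stably in log space.

```python
import numpy as np

def iwae_bound(log_w):
    """Monte Carlo estimate of the importance weighted lower bound
    log((1/K) * sum_k w_k), evaluated stably via the log-sum-exp trick.

    log_w: shape (K,) array of log importance weights
           log p(x, z_k) - log q(z_k | x) for K proposal samples.
    """
    m = np.max(log_w)  # subtract the max before exponentiating for stability
    return m + np.log(np.mean(np.exp(log_w - m)))

# With K = 1 the estimate reduces to a standard ELBO sample; averaging
# weights before taking the log gives a value at least as large as the
# average of the logs (Jensen's inequality), i.e. a tighter bound.
log_w = np.log(np.array([2.0, 4.0]))
print(iwae_bound(log_w))   # log((2 + 4) / 2) = log 3
print(np.mean(log_w))      # plain averaged-ELBO estimate, smaller
```

The hierarchical scheme in the paper replaces the i.i.d. draws behind `log_w` with correlated proposals; the bound computation itself is unchanged.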
Year: 2019
Venue: International Conference on Machine Learning
Field: Pattern recognition, Computer science, Artificial intelligence, Machine learning
DocType:
Volume: abs/1905.04866
Citations: 1
Journal:
PageRank: 0.35
References: 0
Authors: 5
Name                Order  Citations  PageRank
Chin-Wei Huang      1      8          5.18
Kris Sankaran       2      1          0.68
Eeshan Dhekane      3      1          0.68
Alexandre Lacoste   4      147        13.05
Aaron C. Courville  5      6671       348.46