Title
Learning Undirected Posteriors by Backpropagation through MCMC Updates
Abstract
The representation of the posterior is a critical aspect of effective variational autoencoders (VAEs). Poor choices of posterior degrade the generative performance of VAEs because of the resulting mismatch with the true posterior. We extend the class of posteriors that can be learned by using undirected graphical models. We develop an efficient method for training undirected posteriors by showing that the gradient of the training objective with respect to the parameters of the undirected posterior can be computed by backpropagation through Markov chain Monte Carlo updates. We apply these gradient estimators to train discrete VAEs with Boltzmann machine posteriors and demonstrate that undirected posteriors outperform previous results obtained with directed graphical models as posteriors.
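The central claim above, that gradients of the training objective with respect to the parameters of an undirected (Boltzmann machine) posterior can be obtained by backpropagating through Markov chain Monte Carlo updates, can be illustrated with a toy sketch. The PyTorch code below is an assumption, not the authors' estimator: the binary-Concrete relaxation of the Gibbs-style updates, the chain length, the temperature, and the stand-in loss are all hypothetical choices made only to show how gradients can flow through a short chain of MCMC-style updates into the Boltzmann machine parameters W and b.

```python
# Minimal sketch (assumption: not the paper's exact estimator). Block
# Gibbs-style updates of a tiny Boltzmann-machine posterior are relaxed
# with binary-Concrete noise so a downstream loss can be differentiated
# with respect to the couplings W and biases b through the whole chain.
import torch

torch.manual_seed(0)
n, T, tau = 4, 10, 0.5                             # toy latent size, chain length, temperature

W = torch.nn.Parameter(0.1 * torch.randn(n, n))    # pairwise couplings (hypothetical init)
b = torch.nn.Parameter(torch.zeros(n))             # unit biases

def relaxed_gibbs_chain(W, b, n_steps, tau):
    """Run n_steps relaxed block-Gibbs-style updates and return the final state."""
    Wsym = 0.5 * (W + W.t()) * (1 - torch.eye(n))  # symmetric couplings, no self-loops
    z = torch.full((n,), 0.5)                      # start the chain at 0.5 for every unit
    for _ in range(n_steps):
        logits = Wsym @ z + b                      # conditional logits given the current state
        u = torch.rand(n).clamp(1e-6, 1 - 1e-6)
        noise = torch.log(u) - torch.log1p(-u)     # Logistic noise for the binary Concrete
        z = torch.sigmoid((logits + noise) / tau)  # relaxed "sample"; keeps the autograd graph
    return z

z = relaxed_gibbs_chain(W, b, T, tau)
loss = ((z - torch.tensor([1.0, 0.0, 1.0, 0.0])) ** 2).mean()  # stand-in for a VAE objective term
loss.backward()                                    # gradients flow through all T updates
print(W.grad.norm().item(), b.grad.norm().item())
```

In a VAE setting, the final state of the chain would typically enter the evidence lower bound in place of the toy squared-error loss used here, and the same backward pass would also carry gradients into the decoder.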
Year: 2019
Venue: arXiv: Machine Learning
DocType: Journal
Volume: abs/1901.03440
Citations: 0
PageRank: 0.34
References: 26
Authors: 3
Name                  Order  Citations  PageRank
Vahdat, Arash         1      353        18.20
Evgeny Andriyash      2      12         2.80
William G. Macready   3      161        39.07