Title
Generative Adversarial Networks as Variational Training of Energy Based Models
Abstract
In this paper, we study deep generative models for effective unsupervised learning. We propose VGAN, which works by minimizing a variational lower bound of the negative log likelihood (NLL) of an energy based model (EBM), where the model density $p(\mathbf{x})$ is approximated by a variational distribution $q(\mathbf{x})$ that is easy to sample from. The training of VGAN takes a two-step procedure: given $p(\mathbf{x})$, $q(\mathbf{x})$ is updated to maximize the lower bound; $p(\mathbf{x})$ is then updated one step with samples drawn from $q(\mathbf{x})$ to decrease the lower bound. VGAN is inspired by generative adversarial networks (GANs), where $p(\mathbf{x})$ corresponds to the discriminator and $q(\mathbf{x})$ corresponds to the generator, but with several notable differences; we hence name our model variational GANs (VGANs). VGAN provides a practical solution to training deep EBMs in high dimensional space, by eliminating the need for MCMC sampling. From this view, we are also able to identify causes of the difficulty of training GANs and propose viable solutions.
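The two-step procedure described in the abstract maps directly onto a GAN-style training loop: the energy network plays the discriminator role and the sampler for $q(\mathbf{x})$ plays the generator role. Below is a minimal PyTorch sketch of that loop. The EnergyNet and Generator architectures, the optimizer settings, and the omission of the intractable entropy term $H(q)$ (which the paper approximates) are all illustrative assumptions, not the paper's exact construction.

```python
# Sketch of VGAN-style two-step training: p(x) ∝ exp(-E_theta(x)) is the EBM
# (discriminator role); q(x) is defined implicitly by a generator (generator
# role). Architectures and hyperparameters here are placeholders.
import torch
import torch.nn as nn

class EnergyNet(nn.Module):
    """E_theta(x): scalar energy, so the model density is p(x) ∝ exp(-E_theta(x))."""
    def __init__(self, dim):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(dim, 128), nn.ReLU(), nn.Linear(128, 1))
    def forward(self, x):
        return self.net(x).squeeze(-1)

class Generator(nn.Module):
    """Defines q(x) implicitly: x = G_phi(z), z ~ N(0, I); easy to sample from."""
    def __init__(self, z_dim, dim):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(z_dim, 128), nn.ReLU(), nn.Linear(128, dim))
    def forward(self, z):
        return self.net(z)

def train_step(E, G, x_real, opt_E, opt_G, z_dim=64):
    b = x_real.size(0)

    # Step 1: update q(x) to maximize the variational lower bound. The
    # phi-dependent part of the bound is H(q) - E_{x~q}[E_theta(x)], so
    # maximizing it pushes generated samples toward low energy. H(q) is
    # intractable for an implicit q and is omitted in this sketch (the
    # paper discusses how to handle it).
    z = torch.randn(b, z_dim)
    g_loss = E(G(z)).mean()
    opt_G.zero_grad(); g_loss.backward(); opt_G.step()

    # Step 2: update p(x) one step to decrease the bound
    # E_data[E_theta(x)] - E_{x~q}[E_theta(x)] + H(q), using samples drawn
    # from q(x) in place of intractable model samples (no MCMC needed).
    x_fake = G(torch.randn(b, z_dim)).detach()
    e_loss = E(x_real).mean() - E(x_fake).mean()
    opt_E.zero_grad(); e_loss.backward(); opt_E.step()
    return g_loss.item(), e_loss.item()

if __name__ == "__main__":
    dim, z_dim = 2, 64
    E, G = EnergyNet(dim), Generator(z_dim, dim)
    opt_E = torch.optim.Adam(E.parameters(), lr=1e-4)
    opt_G = torch.optim.Adam(G.parameters(), lr=1e-4)
    x = torch.randn(32, dim)  # stand-in for a batch of real data
    print(train_step(E, G, x, opt_E, opt_G, z_dim))
```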
Year: 2016
Venue: arXiv: Learning
Field: Mathematical optimization, Discriminator, Markov chain Monte Carlo, Upper and lower bounds, Unsupervised learning, High dimensional space, Generative grammar, Mathematics, Negative log likelihood
DocType:
Volume: abs/1611.01799
Citations: 1
Journal:
PageRank: 0.35
References: 0
Authors: 4
Name                   Order  Citations  PageRank
Shuangfei Zhai         1      99         10.00
Yu Cheng               2      615        55.76
Rogério Feris          3      1529       89.95
Zhongfei (Mark) Zhang  4      2451       164.30