Abstract
---
A new form of variational autoencoder (VAE) is developed, in which the joint distribution of data and codes is considered in two (symmetric) forms: (i) from observed data fed through the encoder to yield codes, and (ii) from latent codes drawn from a simple prior and propagated through the decoder to manifest data. Lower bounds are learned for the marginal log-likelihoods of the observed data and the latent codes. When learning with the variational bound, one seeks to minimize the symmetric Kullback-Leibler divergence of the joint density functions from (i) and (ii), while simultaneously seeking to maximize the two marginal log-likelihoods. To facilitate learning, a new form of adversarial training is developed. An extensive set of experiments is performed, in which we demonstrate state-of-the-art data reconstruction and generation on several image benchmark datasets.
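To make the objective concrete, here is a sketch in notation introduced for illustration (the paper's own symbols may differ): write the encoder-path joint of form (i) as $q_\phi(x,z)=q(x)\,q_\phi(z|x)$ and the decoder-path joint of form (ii) as $p_\theta(x,z)=p(z)\,p_\theta(x|z)$. The criterion described in the abstract then reads

```latex
\min_{\theta,\phi}\;
\underbrace{\mathrm{KL}\!\left(q_\phi(x,z)\,\|\,p_\theta(x,z)\right)
+ \mathrm{KL}\!\left(p_\theta(x,z)\,\|\,q_\phi(x,z)\right)}_{\text{symmetric KL of (i) and (ii)}}
\;-\;\mathbb{E}_{q(x)}\!\left[\log p_\theta(x)\right]
\;-\;\mathbb{E}_{p(z)}\!\left[\log q_\phi(z)\right].
```

The log-density ratio inside both KL terms is intractable, which is where the adversarial training enters: a discriminator trained by logistic regression to separate samples of the two joints recovers that ratio as its logit at the optimum. The following is a minimal PyTorch-style sketch of that mechanism, assuming MLP networks and a standard Gaussian prior; all names (`Encoder`, `Decoder`, `Discriminator`, `asvae_losses`) are hypothetical stand-ins, not the authors' code.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class Encoder(nn.Module):
    """q_phi(z|x): diagonal Gaussian (hypothetical MLP)."""
    def __init__(self, x_dim, z_dim):
        super().__init__()
        self.body = nn.Sequential(nn.Linear(x_dim, 256), nn.ReLU())
        self.mu = nn.Linear(256, z_dim)
        self.logvar = nn.Linear(256, z_dim)

    def forward(self, x):
        h = self.body(x)
        return self.mu(h), self.logvar(h)

class Decoder(nn.Module):
    """p_theta(x|z): mean of a fixed-variance likelihood (hypothetical MLP)."""
    def __init__(self, x_dim, z_dim):
        super().__init__()
        self.body = nn.Sequential(nn.Linear(z_dim, 256), nn.ReLU(),
                                  nn.Linear(256, x_dim))

    def forward(self, z):
        return self.body(z)

class Discriminator(nn.Module):
    """psi(x,z): at the logistic optimum its logit is log q(x,z) - log p(x,z)."""
    def __init__(self, x_dim, z_dim):
        super().__init__()
        self.body = nn.Sequential(nn.Linear(x_dim + z_dim, 256), nn.ReLU(),
                                  nn.Linear(256, 1))

    def forward(self, x, z):
        return self.body(torch.cat([x, z], dim=1)).squeeze(1)

def asvae_losses(x, enc, dec, disc, z_dim):
    # Encoder path (i): real x, z ~ q_phi(z|x) via reparameterization.
    mu, logvar = enc(x)
    z_q = mu + torch.randn_like(mu) * (0.5 * logvar).exp()
    # Decoder path (ii): z ~ p(z) = N(0, I), x = decoder output.
    z_p = torch.randn(x.size(0), z_dim, device=x.device)
    x_p = dec(z_p)
    # Discriminator loss: logistic regression separating the two joints
    # (encoder/decoder detached so only the discriminator learns from it).
    d_q = disc(x, z_q.detach())
    d_p = disc(x_p.detach(), z_p)
    disc_loss = (F.binary_cross_entropy_with_logits(d_q, torch.ones_like(d_q))
                 + F.binary_cross_entropy_with_logits(d_p, torch.zeros_like(d_p)))
    # Encoder/decoder loss: plug-in estimate of the symmetric KL,
    # KL(q||p) + KL(p||q) ~= E_q[psi] - E_p[psi], plus a reconstruction
    # term standing in for the marginal-likelihood bounds.
    sym_kl = disc(x, z_q).mean() - disc(x_p, z_p).mean()
    recon = F.mse_loss(dec(z_q), x)
    return disc_loss, sym_kl + recon
```

In use, the discriminator and the encoder/decoder would be updated in alternation with separate optimizers, as in standard GAN training; a latent-space analogue of the reconstruction term would handle $\mathbb{E}_{p(z)}[\log q_\phi(z)]$ and is omitted here for brevity.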
Year | Venue | DocType
---|---|---
2017 | ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 30 (NIPS 2017) | Journal

Volume | ISSN | Citations
---|---|---
30 | 1049-5258 | 12

PageRank | References | Authors
---|---|---
0.54 | 28 | 7
Name | Order | Citations | PageRank |
---|---|---|---
Yunchen Pu | 1 | 88 | 8.55 |
Weiyao Wang | 2 | 30 | 2.52 |
Ricardo Henao | 3 | 286 | 23.85 |
Liqun Chen | 4 | 28 | 4.77 |
Zhe Gan | 5 | 319 | 32.58 |
Chunyuan Li | 6 | 467 | 33.86 |
Lawrence Carin | 7 | 137 | 11.38 |