Title
DVAE++: Discrete Variational Autoencoders with Overlapping Transformations
Abstract
Training of discrete latent variable models remains challenging because passing gradient information through discrete units is difficult. We propose a new class of smoothing transformations based on a mixture of two overlapping distributions, and show that the proposed transformation can be used for training binary latent models with either directed or undirected priors. We derive a new variational bound to efficiently train with Boltzmann machine priors. Using this bound, we develop DVAE++, a generative model with a global discrete prior and a hierarchy of convolutional continuous variables. Experiments on several benchmarks show that overlapping transformations outperform other recent continuous relaxations of discrete latent variables including Gumbel-Softmax (Maddison et al., 2016; Jang et al., 2016), and discrete variational autoencoders (Rolfe 2016).
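The abstract's "mixture of two overlapping distributions" can be made concrete with a short sketch. The paper relaxes a Bernoulli(q) variable into a continuous variable on [0, 1] whose mixture CDF inverts in closed form, so a uniform noise draw maps differentiably to a sample. The sketch below assumes the overlapping-exponential form (components proportional to exp(-beta*z) and exp(beta*(z-1))); the function names and the NumPy implementation are illustrative, not taken from the record, and the closed-form inverse is re-derived here rather than copied from the paper.

```python
import numpy as np

def mixture_cdf(q, beta, z):
    """CDF of the overlapping mixture q*r(z|1) + (1-q)*r(z|0) on [0, 1],
    with r(z|0) ~ exp(-beta*z) and r(z|1) ~ exp(beta*(z-1)), each
    normalized on the unit interval."""
    eb = np.exp(-beta)
    return ((1.0 - q) * (1.0 - np.exp(-beta * z))
            + q * (np.exp(beta * (z - 1.0)) - eb)) / (1.0 - eb)

def smoothed_sample(q, beta, rho):
    """Inverse-CDF sample: map uniform noise rho in (0, 1) to a relaxed
    sample z in (0, 1). Substituting u = exp(-beta*z) into the CDF gives
    a quadratic a*u^2 + b*u - q*eb = 0, solved for its positive root.
    Assumes 0 < q < 1 (the q -> 0/1 limits need the degenerate case)."""
    eb = np.exp(-beta)
    c = rho * (1.0 - eb)               # CDF value times the normalizer
    a = 1.0 - q
    b = c + q * eb - (1.0 - q)
    u = (-b + np.sqrt(b * b + 4.0 * a * q * eb)) / (2.0 * a)
    return -np.log(u) / beta
```

Because the sample is an explicit differentiable function of q and the noise rho, reparameterized gradients flow through the relaxed variable, which is the property the abstract contrasts with other continuous relaxations such as Gumbel-Softmax. For large beta the mixture concentrates near the endpoints 0 and 1, recovering the discrete variable in the limit.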
Year
2018
Venue
ICML
DocType
Conference
Volume
abs/1802.04920
Citations
5
PageRank
0.43
References
18
Authors
5
Name                 Order  Citations  PageRank
Vahdat, Arash        1      353        18.20
William G. Macready  2      161        39.07
Zhengbing Bian       3      20         2.20
Amir Khoshaman       4      13         1.39
Evgeny Andriyash     5      5          0.43