Title
Preventing Posterior Collapse with $\delta$-VAEs.
Abstract
Due to the phenomenon of posterior collapse, current latent variable generative models pose a challenging design choice: either weaken the capacity of the decoder or augment the objective so it does not only maximize the likelihood of the data. In this paper, we propose an alternative that utilizes the most powerful generative models as decoders while still optimizing the variational lower bound and ensuring that the latent variables preserve and encode useful information. Our proposed $\delta$-VAEs achieve this by constraining the variational family for the posterior to have a minimum distance to the prior. For sequential latent variable models, our approach resembles the classic representation learning method of slow feature analysis. We demonstrate the efficacy of our approach at modeling text on LM1B and modeling images: learning representations, improving sample quality, and achieving state-of-the-art log-likelihood on CIFAR-10 and ImageNet $32\times 32$.
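The minimum-distance constraint described in the abstract can be made concrete with a short sketch. Below is a minimal NumPy illustration (not the authors' code; the function name, the sigmoid squashing, and the band $[0.3, 0.7]$ are illustrative assumptions): for a diagonal Gaussian posterior $q = \mathcal{N}(\mu, \sigma^2)$ and a standard normal prior, the per-dimension KL is $0.5(\mu^2 + \sigma^2 - 1) - \ln\sigma$, which vanishes only at $\mu = 0, \sigma = 1$; squashing $\sigma$ into a band that excludes 1 therefore guarantees a minimum KL $\delta > 0$.

```python
import numpy as np

def delta_vae_kl(mu, raw_scale, sigma_lo=0.3, sigma_hi=0.7):
    """Per-dimension KL(q || p) for q = N(mu, sigma^2), p = N(0, 1),
    with sigma squashed into [sigma_lo, sigma_hi] so the KL cannot
    reach zero: a sketch of the delta-VAE minimum-rate constraint."""
    # Sigmoid-squash the unconstrained scale into a band that excludes sigma = 1.
    sigma = sigma_lo + (sigma_hi - sigma_lo) / (1.0 + np.exp(-raw_scale))
    # Closed-form Gaussian KL; zero only when mu = 0 and sigma = 1,
    # so the band enforces a strictly positive lower bound on the KL.
    return 0.5 * (mu**2 + sigma**2 - 1.0) - np.log(sigma)

# With these illustrative bounds, even the best-case posterior (mu = 0,
# sigma -> 0.7) pays about 0.10 nats per dimension, so the decoder
# cannot collapse the posterior onto the prior and ignore the latents.
print(delta_vae_kl(np.zeros(4), np.full(4, 10.0)))  # ~[0.102 0.102 0.102 0.102]
```

This corresponds to the paper's simpler variance-constraint construction; the sequential variant achieves the same committed rate differently, by pairing a temporally correlated AR(1) prior with a factorized posterior so the two families can never match exactly.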
Year
2019
Venue
ICLR
Field
ENCODE, Upper and lower bounds, Design choice, Theoretical computer science, Latent variable, Artificial intelligence, Generative grammar, Phenomenon, Mathematics, Pattern recognition (psychology), Feature learning, Machine learning
DocType
Volume
abs/1901.03416
Citations
0
Journal
PageRank
0.34
References
19
Authors
4
Name                Order  Citations  PageRank
Ali Razavi          1      0          0.34
Aäron Van Den Oord  2      15856      4.43
Ben Poole           3      5545       2.06
Oriol Vinyals       4      94194      18.45