Title
Variational inference with Gaussian mixture model and Householder flow.
Abstract
The variational auto-encoder (VAE) is a powerful and scalable deep generative model. Under the VAE architecture, the choice of approximate posterior distribution is a crucial issue that significantly affects the tractability and flexibility of the model. Latent variables are typically assumed to be normally distributed with a diagonal covariance matrix; however, this is not flexible enough to match the true, complex posterior distribution. We introduce a novel approach for designing a flexible and arbitrarily complex approximate posterior distribution. Unlike the standard VAE, an initial density is first constructed as a Gaussian mixture model in which each component has a diagonal covariance matrix. This relatively simple distribution is then transformed into a more flexible one by applying a sequence of invertible Householder transformations until the desired complexity is reached. We also give a detailed theoretical and geometric interpretation of Householder transformations. Finally, because of this change of approximate posterior distribution, the Kullback–Leibler divergence between two mixture densities must be computed, which has no closed-form solution; we therefore redefine a new variational lower bound by means of its upper bound. Compared with other generative models based on a similar VAE architecture, our method achieves new state-of-the-art results on benchmark datasets including MNIST, Fashion-MNIST, Omniglot, and Histopathology, a more challenging medical imaging dataset; the experimental results show that our method improves the flexibility of the posterior distribution more effectively.
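The core building block described in the abstract is the Householder transformation, an invertible reflection H = I - 2vv^T/||v||^2 whose Jacobian has |det H| = 1, so stacking such transformations adds no log-det-Jacobian cost to the variational bound. The sketch below is a generic, hedged illustration of this idea, not the authors' implementation: the latent dimension, flow length, and the way the reflection vectors are drawn are all illustrative assumptions.

```python
import numpy as np

def householder(z, v):
    """Apply the Householder reflection H = I - 2 v v^T / ||v||^2 to z.

    H is orthogonal (H @ H.T = I) and |det H| = 1, so a flow built by
    composing such reflections needs no log-det-Jacobian correction.
    """
    v = v / np.linalg.norm(v)          # unit reflection vector
    return z - 2.0 * v * (v @ z)       # z - 2 v (v^T z)

# Illustrative use: a sample from one diagonal-covariance mixture
# component, pushed through a short Householder flow (T = 3 steps).
rng = np.random.default_rng(0)
mu, sigma = np.zeros(4), np.ones(4)
z = mu + sigma * rng.standard_normal(4)    # z ~ N(mu, diag(sigma^2))
for _ in range(3):
    z = householder(z, rng.standard_normal(4))
```

Because each reflection is its own inverse, applying the same transformation twice recovers the original sample, which is what makes the flow invertible by construction.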
Year: 2019
DOI: 10.1016/j.neunet.2018.10.002
Venue: Neural Networks
Keywords: Variational auto-encoder, Gaussian mixture model, Householder flow, Variational inference
Field: Mathematical optimization, MNIST database, Upper and lower bounds, Algorithm, Closed-form expression, Posterior probability, Latent variable, Invertible matrix, Mixture model, Mathematics, Generative model
DocType: Journal
Volume: 109
Issue: 1
ISSN: 0893-6080
Citations: 0
PageRank: 0.34
References: 8
Authors: 5
Name, Order, Citations, PageRank:
GuoJun Liu1134.63
Yang Li2659125.00
Mao-Zu Guo352653.96
Peng Li427569.71
Mingyu Li5208.37