Title
Topic-Guided Variational Autoencoders for Text Generation
Abstract
We propose a topic-guided variational autoencoder (TGVAE) model for text generation. Distinct from existing variational autoencoder (VAE) based approaches, which assume a simple Gaussian prior for the latent code, our model specifies the prior as a Gaussian mixture model (GMM) parametrized by a neural topic module. Each mixture component corresponds to a latent topic, which provides guidance for generating sentences under that topic. The neural topic module and the VAE-based neural sequence module are learned jointly. In particular, a sequence of invertible Householder transformations is applied during inference to endow the approximate posterior of the latent code with high flexibility. Experimental results show that TGVAE outperforms alternative approaches on both unconditional and conditional text generation, producing semantically meaningful sentences across a variety of topics.
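The two technical ingredients in the abstract — a GMM prior whose components correspond to topics, and a Householder flow that adds flexibility to the approximate posterior — can be illustrated with a minimal NumPy sketch. This is not the authors' implementation: in the actual model the mixture parameters come from the neural topic module and the reflection vectors from the inference network; here they are random placeholders.

```python
import numpy as np

def sample_gmm_prior(means, log_vars, topic_probs, rng):
    """Draw a latent code from a GMM prior: pick a topic component,
    then sample from that component's diagonal Gaussian."""
    k = rng.choice(len(topic_probs), p=topic_probs)
    return means[k] + np.exp(0.5 * log_vars[k]) * rng.normal(size=means[k].shape)

def householder_flow(z, vs):
    """Apply a sequence of Householder reflections H = I - 2 v v^T / ||v||^2.
    Each reflection is orthogonal, so the transformation is invertible and
    volume-preserving (|det| = 1), which keeps the density change tractable."""
    for v in vs:
        v = v / np.linalg.norm(v)
        z = z - 2.0 * np.dot(v, z) * v
    return z

rng = np.random.default_rng(0)
dim, n_topics, n_flows = 8, 3, 4

# Placeholder GMM parameters (in TGVAE these come from the topic module).
means = rng.normal(size=(n_topics, dim))
log_vars = np.zeros((n_topics, dim))
topic_probs = np.array([0.5, 0.3, 0.2])
z_prior = sample_gmm_prior(means, log_vars, topic_probs, rng)

# Placeholder reflection vectors (in TGVAE these come from the encoder).
z0 = rng.normal(size=dim)
vs = [rng.normal(size=dim) for _ in range(n_flows)]
zK = householder_flow(z0, vs)

# Householder reflections preserve the Euclidean norm of the sample.
print(np.allclose(np.linalg.norm(zK), np.linalg.norm(z0)))  # → True
```

The norm-preservation check at the end is what makes Householder flows attractive: the Jacobian determinant of each reflection is -1, so the log-density of the transformed sample needs no correction term.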
Year: 2019
Venue: North American Chapter of the Association for Computational Linguistics
DocType:
Journal:
Volume: abs/1903.07137
Citations: 1
PageRank: 0.35
References: 42
Authors: 8
Name            Order   Citations   PageRank
Wenlin Wang     1       93          9.18
Zhe Gan         2       319         32.58
Hongteng Xu     3       282         27.10
Ruiyi Zhang     4       21          10.04
Guoyin Wang     5       24          7.38
Dinghan Shen    6       108         10.37
Changyou Chen   7       365         36.95
L. Carin        8       4603        339.36