Title
Controlled Text Generation Using Dictionary Prior in Variational Autoencoders
Abstract
While variational autoencoders (VAEs) have been widely applied to text generation tasks, they face two challenges: insufficient representation capacity and poor controllability. The former results from posterior collapse and restrictive prior assumptions, which impede better representation learning. The latter arises because the continuous latent variables in traditional formulations limit the interpretability and controllability of VAEs. In this paper, we propose Dictionary Prior (DPrior), a new data-driven prior that enjoys the merits of both expressivity and controllability. To facilitate controlled text generation with DPrior, we propose to employ contrastive learning to separate the latent space into several parts. Extensive experiments on both language modeling and controlled text generation demonstrate the effectiveness of the proposed approach.
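The core idea of a dictionary prior, as the abstract describes it, is to replace a fixed continuous prior with a data-driven one built from a small set of learnable key vectors, so that a latent code is expressed in terms of dictionary entries. The minimal sketch below illustrates that idea only; the sizes, names, and the softmax-attention parameterization are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes (assumptions): K dictionary entries, d-dim latent space.
K, d = 8, 16

# In training these would be learnable parameters; here they are random.
dictionary = rng.normal(size=(K, d))

def latent_from_dictionary(h):
    """Map an encoder hidden state h to a latent code z as a softmax-weighted
    combination of dictionary entries, so z lies in the span of the dictionary
    (a sketch of a data-driven prior, not the paper's exact formulation)."""
    logits = dictionary @ h                       # similarity of h to each key
    weights = np.exp(logits - logits.max())
    weights /= weights.sum()                      # softmax attention over keys
    return weights @ dictionary                   # convex combination of keys

h = rng.normal(size=d)                            # stand-in encoder output
z = latent_from_dictionary(h)
assert z.shape == (d,)
```

Because every latent code is anchored to a finite set of dictionary entries, the entries themselves become interpretable handles; partitioning them (e.g. via contrastive learning, as the abstract proposes) is what enables attribute-controlled generation.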
Year: 2022
DOI: 10.18653/v1/2022.findings-acl.10
Venue: Findings of the Association for Computational Linguistics (ACL 2022)
DocType: Conference
Volume: Findings of the Association for Computational Linguistics: ACL 2022
Citations: 0
PageRank: 0.34
References: 0
Authors: 6
Name            Order  Citations  PageRank
Xianghong Fang  1      0          0.34
Jian Li         2      3          2.74
Lifeng Shang    3      485        30.96
Xin Jiang       4      150        32.43
Qun Liu         5      2149       203.11
Dit-Yan Yeung   6      0          0.34