Title
A Transformer-Based Hierarchical Variational AutoEncoder Combined Hidden Markov Model for Long Text Generation
Abstract
The Variational AutoEncoder (VAE) has made significant progress in text generation, but most work has focused on short text (typically a single sentence). Long texts consist of multiple sentences, and there are particular relationships between them, especially between the latent variables that control the generation of each sentence. These relationships between latent variables help in generating coherent and logically connected long texts, yet very few studies have examined them. We propose a method that combines a Transformer-based Hierarchical Variational AutoEncoder with a Hidden Markov Model (HT-HVAE) to learn multiple hierarchical latent variables and their relationships, thereby improving long text generation. We use a hierarchical Transformer encoder to encode the long text in order to obtain better hierarchical information about it. HT-HVAE's generation network uses an HMM to learn the relationships between latent variables. We also propose a method for calculating perplexity under this multiple hierarchical latent variable structure. Experimental results show that our model is more effective on datasets with strong logical structure, alleviates the notorious posterior collapse problem, and generates more coherent and logically connected long text.
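As a rough illustration of the generative structure the abstract describes (sentence-level latent variables linked by Markov-style transitions, each of which would condition a sentence decoder), the sketch below rolls out such a chain. It is only a toy sketch under assumptions: the linear-Gaussian transition, the dimensions, and all names are invented for illustration, and a continuous Markov chain stands in for the paper's actual HMM parameterization, which the abstract does not specify.

```python
# Toy sketch (assumptions only): sentence-level latents z_1..z_T generated by
# Markov-style transitions, seeded by a document-level latent. In the real
# model each z_t would condition a Transformer decoder for sentence t.
import numpy as np

rng = np.random.default_rng(0)
latent_dim = 16      # assumed size of each sentence-level latent
num_sentences = 5    # assumed number of sentences in the long text

# Assumed document-level (top) latent that seeds the chain.
z_doc = rng.standard_normal(latent_dim)

# Assumed linear-Gaussian transition linking consecutive sentence latents,
# standing in for the HMM that relates latent variables across sentences.
A = 0.1 * rng.standard_normal((latent_dim, latent_dim))
B = 0.1 * rng.standard_normal((latent_dim, latent_dim))

def next_latent(z_prev, z_doc):
    """Sample the next sentence latent from the previous one and the document latent."""
    mean = A @ z_prev + B @ z_doc
    return mean + 0.1 * rng.standard_normal(latent_dim)

# Roll out the chain of sentence latents.
z = rng.standard_normal(latent_dim)  # initial sentence latent
sentence_latents = []
for t in range(num_sentences):
    sentence_latents.append(z)
    z = next_latent(z, z_doc)

print(len(sentence_latents), sentence_latents[0].shape)  # -> 5 (16,)
```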
Year: 2021
DOI: 10.3390/e23101277
Venue: ENTROPY
Keywords: Variational AutoEncoder, text generation, Hidden Markov Model, Transformer, latent variables
DocType: Journal
Volume: 23
Issue: 10
ISSN: 1099-4300
Citations: 0
PageRank: 0.34
References: 0
Authors: 4
Name | Order | Citations | PageRank
Kun Zhao | 1 | 15 | 6.04
H. Ding | 2 | 0 | 1.69
Kai Ye | 3 | 0 | 1.69
Xiaohui Cui | 4 | 0 | 0.34