Title
Neural Text Generation: Past, Present and Beyond.
Abstract
This paper presents a systematic survey of recent developments in neural text generation models. Specifically, we start from recurrent neural network language models with the traditional maximum likelihood estimation training scheme and point out its shortcomings for text generation. We then introduce recently proposed methods for text generation based on reinforcement learning, re-parametrization tricks, and generative adversarial net (GAN) techniques. We compare the properties of these models and the corresponding techniques for handling their common problems, such as gradient vanishing and generation diversity. Finally, we conduct a benchmarking experiment with different types of neural text generation models on two well-known datasets and discuss the empirical results along with the aforementioned model properties.
Year
2018
Venue
arXiv: Computation and Language
Field
Text generation, Recurrent neural network language models, Computer science, Maximum likelihood, Artificial intelligence, Generative grammar, Machine learning, Benchmarking, Reinforcement learning
DocType
Journal
Volume
abs/1803.07133
Citations
6
PageRank
0.46
References
25
Authors
5
Name | Order | Citations | PageRank
Sidi Lu | 1 | 57 | 3.24
Yaoming Zhu | 2 | 6 | 3.16
Weinan Zhang | 3 | 1228 | 97.24
Jun Wang | 4 | 2514 | 138.37
Yong Yu | 5 | 7637 | 380.66