Title
A Semi-Supervised Approach for Low-Resourced Text Generation
Abstract
Recently, encoder-decoder neural models have achieved great success on text generation tasks. However, one problem with such models is that their performance is usually limited by the scale of well-labeled data, which is expensive to obtain. This low-resource (labeled-data) problem is common across text generation tasks, while unlabeled data are usually abundant. In this paper, we propose a method that exploits unlabeled data to improve the performance of such models in low-resource settings. We use a denoising auto-encoder (DAE) and language-model (LM) based reinforcement learning (RL) to enhance the training of the encoder and decoder with unlabeled data. Our method adapts to different text generation tasks and yields significant improvements over baseline text generation models.
Year: 2019
Venue: arXiv: Computation and Language
DocType: Journal
Volume: abs/1906.00584
Citations: 0
PageRank: 0.34
References: 0
Authors: 2
Name | Order | Citations | PageRank
Hongyu Zang | 1 | 1 | 1.03
Xiaojun Wan | 2 | 1685 | 125.70