Title
Plausibility-promoting Generative Adversarial Network for Abstractive Text Summarization with Multi-task Constraint
Abstract
Abstractive text summarization is an essential task in natural language processing that aims to generate concise summaries retaining the salient information of the input document. Despite the progress of previous work, generating summaries that are informative, grammatically correct, and diverse remains challenging in practice. In this paper, we present a Plausibility-promoting Generative Adversarial Network for Abstractive Text Summarization with Multi-Task constraint (PGAN-ATSMT), which shows promising performance in generating informative, grammatically correct, and novel summaries. First, PGAN-ATSMT adopts a plausibility-promoting generative adversarial network that jointly trains a discriminative model D and a generative model G via adversarial learning. The generative model G employs a sequence-to-sequence architecture as its backbone, taking the original text as input and generating a corresponding summary. A novel language-model-based discriminator D is proposed to distinguish the summaries generated by G from the ground-truth summaries, avoiding the saturation issue of previous binary-classifier discriminators. The generative model G and the discriminative model D are trained in a minimax two-player game, so the adversarial process eventually adjusts G to produce high-quality, plausible summaries. Second, we propose two extended regularizations for the generative model G via multi-task learning, sharing its LSTM encoder and LSTM decoder with a text categorization task and a syntax annotation task, respectively. These auxiliary tasks help locate the salient information of a document and improve summary quality from a language-modeling perspective, alleviating the issues of incomplete sentences and duplicated words. Experimental results on two benchmark datasets show that PGAN-ATSMT outperforms state-of-the-art baseline methods in both quantitative and qualitative evaluations.
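The abstract describes an adversarial training scheme (seq2seq LSTM generator, language-model discriminator that scores plausibility, and auxiliary text-categorization and syntax-annotation heads sharing the encoder and decoder). The sketch below is a minimal, hypothetical illustration of that scheme, not the authors' implementation: the module names, toy dimensions, loss weights, and the REINFORCE-style surrogate for the minimax game are all assumptions made for clarity.

```python
# Hypothetical sketch of the PGAN-ATSMT training idea (not the authors' code).
# Assumptions: PyTorch, single-layer LSTMs, toy sizes, illustrative loss weights.
import torch
import torch.nn as nn
import torch.nn.functional as F

VOCAB, EMB, HID, N_CLASSES, N_TAGS = 1000, 64, 128, 5, 45   # toy sizes (assumed)

class Generator(nn.Module):
    """Seq2seq generator G: LSTM encoder + LSTM decoder, plus two auxiliary heads
    sharing the encoder (text categorization) and decoder (syntax annotation)."""
    def __init__(self):
        super().__init__()
        self.emb = nn.Embedding(VOCAB, EMB)
        self.encoder = nn.LSTM(EMB, HID, batch_first=True)
        self.decoder = nn.LSTM(EMB, HID, batch_first=True)
        self.out = nn.Linear(HID, VOCAB)
        self.cls_head = nn.Linear(HID, N_CLASSES)   # shares the encoder
        self.syn_head = nn.Linear(HID, N_TAGS)      # shares the decoder

    def forward(self, src, tgt_in):
        enc_out, state = self.encoder(self.emb(src))
        dec_out, _ = self.decoder(self.emb(tgt_in), state)
        return self.out(dec_out), self.cls_head(enc_out[:, -1]), self.syn_head(dec_out)

class LMDiscriminator(nn.Module):
    """Language-model discriminator D: scores a summary by its average next-token
    log-likelihood (higher = more plausible), instead of a binary real/fake classifier."""
    def __init__(self):
        super().__init__()
        self.emb = nn.Embedding(VOCAB, EMB)
        self.lstm = nn.LSTM(EMB, HID, batch_first=True)
        self.out = nn.Linear(HID, VOCAB)

    def log_likelihood(self, seq):
        x, y = seq[:, :-1], seq[:, 1:]
        h, _ = self.lstm(self.emb(x))
        logp = F.log_softmax(self.out(h), dim=-1)
        return logp.gather(-1, y.unsqueeze(-1)).squeeze(-1).mean(dim=1)   # (batch,)

G, D = Generator(), LMDiscriminator()
g_opt = torch.optim.Adam(G.parameters(), lr=1e-3)
d_opt = torch.optim.Adam(D.parameters(), lr=1e-3)

# Toy batch: source document, gold summary, document category, per-token syntax tags.
src = torch.randint(0, VOCAB, (4, 30))
gold = torch.randint(0, VOCAB, (4, 12))
label = torch.randint(0, N_CLASSES, (4,))
tags = torch.randint(0, N_TAGS, (4, 11))

# D step: raise the LM likelihood of gold summaries, lower it for generated ones.
logits, _, _ = G(src, gold[:, :-1])
fake = torch.cat([gold[:, :1], logits.argmax(-1)], dim=1)   # first token reused as BOS
d_loss = -D.log_likelihood(gold).mean() + D.log_likelihood(fake).mean()
d_opt.zero_grad(); d_loss.backward(); d_opt.step()

# G step: MLE loss + auxiliary multi-task losses + adversarial plausibility reward.
logits, cls_logits, syn_logits = G(src, gold[:, :-1])
mle = F.cross_entropy(logits.reshape(-1, VOCAB), gold[:, 1:].reshape(-1))
cls = F.cross_entropy(cls_logits, label)
syn = F.cross_entropy(syn_logits.reshape(-1, N_TAGS), tags.reshape(-1))
# REINFORCE surrogate (no baseline, for brevity): weight the log-probability of
# sampled summaries by the plausibility score D assigns them.
sample = torch.distributions.Categorical(logits=logits).sample()          # (4, 11)
logp = F.log_softmax(logits, -1).gather(-1, sample.unsqueeze(-1)).squeeze(-1)
with torch.no_grad():
    reward = D.log_likelihood(torch.cat([gold[:, :1], sample], dim=1))    # (4,)
adv = -(reward.unsqueeze(1) * logp).mean()
g_loss = mle + 0.5 * cls + 0.5 * syn + 0.1 * adv    # weights are illustrative only
g_opt.zero_grad(); g_loss.backward(); g_opt.step()
print(f"d_loss={d_loss.item():.3f}  g_loss={g_loss.item():.3f}")
```

In a full training loop the D and G steps would alternate over real data, typically after pretraining G with maximum likelihood; the sketch only shows how the adversarial, MLE, and multi-task losses combine.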
Year
2020
DOI
10.1016/j.ins.2020.02.040
Venue
Information Sciences
Keywords
Abstractive text summarization, Generative adversarial network, Multi-task learning
DocType
Journal
Volume
521
ISSN
0020-0255
Citations
2
PageRank
0.37
References
27
Authors
6
Name            Order   Citations   PageRank
Min Yang        1       77          20.41
Xintong Wang    2       2           0.37
Yao Lu          3       4           0.74
Jianming Lv     4       8           5.39
Ying Shen       5       5           5.49
Chengming Li    6       4           1.86