Title
Gated Graph Neural Attention Networks for abstractive summarization
Abstract
Sequence-to-sequence (Seq2Seq) models for abstractive summarization have attracted wide attention due to their powerful ability to represent sequences. However, sequence-structured data is a simple format that cannot describe the complexity of graphs; this can introduce ambiguity and hurt summarization performance. In this paper, we propose Gated Graph Neural Attention Networks (GGNANs) for abstractive summarization. The proposed GGNANs unify a graph neural network with the celebrated Seq2Seq framework to better encode full graph-structured information. We further propose a graph transform method based on PMI, self-connection, forward-connection and backward-connection to better combine graph-structured and sequence-structured information. Extensive experimental results on LCSTS and Gigaword show that our proposed model outperforms most strong baseline models.
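The abstract names four edge types for its graph transform: self-connections, forward-connections, backward-connections, and PMI-based edges. As a rough illustration of how such a token graph could be assembled, here is a minimal sketch in Python; the function name build_graph, the sliding-window PMI estimate, the window size, and the threshold are all assumptions for illustration, not the authors' published implementation.

from collections import Counter
from itertools import combinations
import numpy as np

def build_graph(tokens, window=5, pmi_threshold=0.0):
    """Sketch of a weighted adjacency matrix over a token sequence,
    combining the four edge sources the abstract describes."""
    n = len(tokens)
    adj = np.zeros((n, n), dtype=np.float32)

    # Self-, forward-, and backward-connections keep the original
    # sequence order recoverable from the graph.
    for i in range(n):
        adj[i, i] = 1.0                 # self-connection
        if i + 1 < n:
            adj[i, i + 1] = 1.0         # forward-connection
            adj[i + 1, i] = 1.0         # backward-connection

    # Count token and token-pair frequencies over sliding windows to
    # estimate pointwise mutual information (PMI); the window size is
    # a hypothetical choice here.
    windows = [tokens[i:i + window] for i in range(max(1, n - window + 1))]
    tok_count, pair_count = Counter(), Counter()
    for w in windows:
        uniq = set(w)
        tok_count.update(uniq)
        pair_count.update(combinations(sorted(uniq), 2))

    total = len(windows)
    for (a, b), c in pair_count.items():
        pmi = np.log(c * total / (tok_count[a] * tok_count[b]))
        if pmi <= pmi_threshold:
            continue
        # Connect every pair of positions holding these two tokens,
        # keeping the stronger of the sequential and PMI weights.
        for i, ti in enumerate(tokens):
            for j, tj in enumerate(tokens):
                if {ti, tj} == {a, b}:
                    adj[i, j] = max(adj[i, j], pmi)
    return adj

For example, build_graph("the cat sat on the mat".split()) returns a 6x6 weighted adjacency matrix that a gated graph encoder could consume alongside the token sequence.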
Year
2021
DOI
10.1016/j.neucom.2020.09.066
Venue
Neurocomputing
Keywords
Seq2Seq, Abstractive summarization, Gated Graph Neural Attention Networks
DocType
Journal
Volume
431
ISSN
0925-2312
Citations
0
PageRank
0.34
References
0
Authors
4
Name          Order  Citations  PageRank
Zeyu Liang    1      0          0.68
Junping Du    2      789        91.80
Yingxia Shao  3      213        24.25
Houye Ji      4      5          2.46