Title
A Discourse-Aware Attention Model for Abstractive Summarization of Long Documents
Abstract
Neural abstractive summarization models have led to promising results in summarizing relatively short documents. We propose the first model for abstractive summarization of single, longer-form documents (e.g., research papers). Our approach consists of a new hierarchical encoder that models the discourse structure of a document, and an attentive discourse-aware decoder to generate the summary. Empirical results on two large-scale datasets of scientific papers show that our model significantly outperforms state-of-the-art models.
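The abstract describes a hierarchical encoder plus a discourse-aware decoder that attends over discourse sections. Below is a minimal NumPy sketch of that hierarchical attention idea only: section-level attention weights rescale word-level attention within each section before forming the context vector. All names, dimensions, and the scoring function (a plain dot product) are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over a 1-D array
    e = np.exp(x - x.max())
    return e / e.sum()

# Hypothetical toy sizes: 3 discourse sections, 4 words each, hidden size 8
rng = np.random.default_rng(0)
d = 8
sec_states = rng.normal(size=(3, d))      # one encoder state per section (discourse unit)
word_states = rng.normal(size=(3, 4, d))  # word-level encoder states, grouped by section
dec_state = rng.normal(size=(d,))         # current decoder state

# Section-level attention: how relevant is each section at this decoding step?
beta = softmax(sec_states @ dec_state)

# Word-level attention within each section, rescaled by that section's weight
scores = np.einsum('swd,d->sw', word_states, dec_state)
alpha = np.concatenate([beta[s] * softmax(scores[s]) for s in range(3)])
alpha /= alpha.sum()                      # normalize over all words in the document

# Context vector: attention-weighted sum of all word states
context = alpha @ word_states.reshape(-1, d)
```

Words in sections the decoder deems irrelevant receive low weight regardless of their local word-level scores, which is the core of the discourse-aware attention scheme.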
Year: 2018
DOI: 10.18653/v1/N18-2097
Venue: North American Chapter of the Association for Computational Linguistics
Volume: abs/1804.05685
Citations: 10
PageRank: 0.60
References: 15
Authors: 7
Authors (Name, Order, Citations, PageRank):
1. Arman Cohan: 139 citations, PageRank 18.25
2. Franck Dernoncourt: 149 citations, PageRank 35.39
3. Doo Soon Kim: 12 citations, PageRank 2.05
4. Trung H. Bui: 86 citations, PageRank 21.88
5. Seok-Hwan Kim: 165 citations, PageRank 23.82
6. Walter Chang: 2511 citations, PageRank 59.67
7. Nazli Goharian: 460 citations, PageRank 49.93