Title
Tweet-aware News Summarization with Dual-Attention Mechanism
Abstract
Neural models have recently been applied to many text summarization tasks. In general, a large number of high-quality reference summaries is required to train well-performing neural models. These reference summaries, i.e., the ground truth, are usually written by humans and are costly to obtain. In this paper, we therefore focus on the unsupervised summarization problem by exploring news articles and readers' comments in linking tweets, i.e., tweets with URLs linking to the news. Our data analysis shows that linking tweets collectively highlight important information in the news but may not fully cover all of its content. This observation inspires us to propose a dual-attention based model, named DAS. The dual-attention mechanism extracts both the important information highlighted by linking tweets and the salient content in the news; specifically, it consists of two similar Transformer structures with multi-head attention. We propose position-dependent word salience, which reflects the effect of local context. Word salience is computed from the dual-attention mechanism, and sentence salience is then estimated from the word salience. Experimental results on a benchmark dataset show that DAS outperforms state-of-the-art unsupervised models and achieves results comparable with state-of-the-art supervised models.
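The scoring pipeline the abstract describes (attention-derived word salience aggregated into sentence salience, then sentence ranking) can be sketched as follows. This is a minimal illustration under stated assumptions, not the paper's implementation: the linear combination weight `alpha`, the random stand-in attention scores, and the mean aggregation are all assumptions, while the actual dual-attention mechanism is a Transformer-based model.

```python
import numpy as np

def word_salience(news_att, tweet_att, alpha=0.5):
    # Hypothetical combination of the two attention signals (news-side and
    # tweet-side); the paper's actual dual-attention mechanism is learned.
    return alpha * news_att + (1.0 - alpha) * tweet_att

def sentence_salience(word_scores, spans):
    # Aggregate word-level salience into per-sentence scores
    # (mean pooling over each sentence's word span is an assumption).
    return np.array([word_scores[s:e].mean() for s, e in spans])

rng = np.random.default_rng(0)
news_att = rng.random(10)          # stand-in for news-side attention salience
tweet_att = rng.random(10)         # stand-in for tweet-side attention salience
spans = [(0, 4), (4, 7), (7, 10)]  # word-index ranges of three toy sentences

scores = sentence_salience(word_salience(news_att, tweet_att), spans)
summary_order = np.argsort(-scores)  # rank sentences, highest salience first
```

An extractive summary would then take the top-ranked sentences in `summary_order` up to a length budget.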
Year
2021
DOI
10.1145/3442442.3452309
Venue
International World Wide Web Conference
Keywords
Unsupervised summarization, dual-attention mechanism, tweets
DocType
Conference
Citations
0
PageRank
0.34
References
7
Authors
3
Name | Order | Citations | PageRank
Xin Zheng | 1 | 19 | 2.33
Aixin Sun | 2 | 3071 | 156.89
Karthik Muthuswamy | 3 | 0 | 0.34