Title
Abstractive Text Summarization with Hierarchical Multi-scale Abstraction Modeling and Dynamic Memory
Abstract
In this paper, we propose a novel abstractive text summarization method with hierarchical multi-scale abstraction modeling and dynamic memory (called MADY). First, we propose a hierarchical multi-scale abstraction modeling method that captures the temporal dependencies of the document at multiple hierarchical levels of abstraction, mimicking how human beings comprehend an article: low-level abstraction layers learn fine timescales while high-level abstraction layers learn coarse timescales. With this adaptive updating mechanism, the high-level abstraction layers are updated less frequently and are expected to remember long-term dependencies better than the low-level abstraction layers. Second, we propose a dynamic key-value memory-augmented attention network that keeps track of the attention history and comprehensive context information for the salient facets within the input document. In this way, our model avoids generating repetitive words and faulty summaries. Extensive experiments on two widely used datasets demonstrate the effectiveness of the proposed MADY model in terms of both automatic and human evaluation. For reproducibility, we release the code and data at: https://github.com/siat-nlp/MADY.git.
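The abstract describes two mechanisms at a high level. As a rough illustration of the first (fine timescales for low-level layers, coarse timescales for high-level layers), here is a minimal PyTorch sketch, not the authors' code: it updates the high-level layer on a fixed period rather than with the learned adaptive mechanism the paper proposes, and all class and parameter names are hypothetical.

```python
import torch
import torch.nn as nn

class MultiScaleEncoder(nn.Module):
    """Two-layer recurrent encoder with a fine and a coarse timescale."""
    def __init__(self, input_size, hidden_size, period=4):
        super().__init__()
        self.low = nn.GRUCell(input_size, hidden_size)    # fine timescale: every step
        self.high = nn.GRUCell(hidden_size, hidden_size)  # coarse timescale
        self.period = period  # fixed update period (a simplifying assumption)

    def forward(self, x):  # x: (seq_len, batch, input_size)
        seq_len, batch, _ = x.shape
        h_low = x.new_zeros(batch, self.low.hidden_size)
        h_high = x.new_zeros(batch, self.high.hidden_size)
        outputs = []
        for t in range(seq_len):
            h_low = self.low(x[t], h_low)          # low-level layer updates every step
            if (t + 1) % self.period == 0:         # high-level layer updates less often,
                h_high = self.high(h_low, h_high)  # so it retains longer-term context
            outputs.append(torch.cat([h_low, h_high], dim=-1))
        return torch.stack(outputs)  # (seq_len, batch, 2 * hidden_size)

enc = MultiScaleEncoder(input_size=8, hidden_size=16, period=4)
print(enc(torch.randn(12, 2, 8)).shape)  # torch.Size([12, 2, 32])
```

The second mechanism, a key-value memory-augmented attention that tracks attention history to discourage repetition, might be sketched as below; the history penalty and accumulation rule are assumptions for illustration, not the paper's actual formulation.

```python
import torch
import torch.nn.functional as F

def memory_attention(query, keys, values, history):
    """Attend over key-value memory slots while accumulating attention history."""
    # query: (batch, d); keys, values: (batch, slots, d); history: (batch, slots)
    scores = torch.einsum("bd,bsd->bs", query, keys)
    scores = scores - history                 # penalize already-attended slots
    weights = F.softmax(scores, dim=-1)       # current attention distribution
    context = torch.einsum("bs,bsd->bd", weights, values)
    return context, history + weights         # carry the updated history forward

q, K, V = torch.randn(2, 16), torch.randn(2, 5, 16), torch.randn(2, 5, 16)
ctx, hist = memory_attention(q, K, V, torch.zeros(2, 5))
print(ctx.shape, hist.shape)  # torch.Size([2, 16]) torch.Size([2, 5])
```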
Year
2021
DOI
10.1145/3404835.3462998
Venue
Research and Development in Information Retrieval
Keywords
Abstractive text summarization, multi-scale abstraction modeling, dynamic memory network
DocType
Conference
Citations
0
PageRank
0.34
References
0
Authors
5
Name            Order  Citations  PageRank
Lihan Wang      1      0          0.34
Min Yang        2      772        0.41
Chengming Li    3      631        0.60
Shen Ying       4      732        3.48
Xu Ruifeng      5      4325       3.04