Title
Summarizing Articles into Sentences by Hierarchical Attention Model and RNN Language Model
Abstract
A tremendous number of articles in various languages appear every day in the big data era. To highlight articles automatically, an artificial neural network method is proposed in this paper. The proposed system is a hierarchical attention model, composed of a word attention model and a sentence attention model with Long Short-Term Memory (LSTM) blocks, together with a Recurrent Neural Network Language Model (RNNLM). Unlike conventional attention-based summarization methods, which condense a single sentence or multiple sentences into one short sentence, the proposed method can take multiple articles as input and output multiple short sentences.
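The abstract gives no implementation details, so the following is a minimal sketch, assuming a PyTorch implementation, of the kind of hierarchical attention encoder it describes: word-level attention over LSTM states produces sentence vectors, and sentence-level attention over a second LSTM produces a document vector that a separate decoder (such as an RNN language model) could condition on. All module names, layer sizes, and the interface are illustrative assumptions, not the authors' code.

```python
# Sketch (assumed, not the paper's code) of a hierarchical attention encoder:
# word-level LSTM + attention -> sentence vectors; sentence-level LSTM +
# attention -> document vector. Dimensions and names are illustrative only.
import torch
import torch.nn as nn


class Attention(nn.Module):
    """Additive attention that pools a sequence of hidden states into one vector."""

    def __init__(self, hidden_dim: int):
        super().__init__()
        self.proj = nn.Linear(hidden_dim, hidden_dim)
        self.context = nn.Linear(hidden_dim, 1, bias=False)

    def forward(self, states: torch.Tensor) -> torch.Tensor:
        # states: (batch, seq_len, hidden_dim)
        scores = self.context(torch.tanh(self.proj(states)))   # (batch, seq_len, 1)
        weights = torch.softmax(scores, dim=1)                  # attention over time steps
        return torch.sum(weights * states, dim=1)               # (batch, hidden_dim)


class HierarchicalAttentionEncoder(nn.Module):
    """Word attention builds sentence vectors; sentence attention builds a document vector."""

    def __init__(self, vocab_size: int, embed_dim: int = 128, hidden_dim: int = 256):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, embed_dim, padding_idx=0)
        self.word_lstm = nn.LSTM(embed_dim, hidden_dim // 2, batch_first=True, bidirectional=True)
        self.word_attn = Attention(hidden_dim)
        self.sent_lstm = nn.LSTM(hidden_dim, hidden_dim // 2, batch_first=True, bidirectional=True)
        self.sent_attn = Attention(hidden_dim)

    def forward(self, docs: torch.Tensor) -> torch.Tensor:
        # docs: (batch, num_sentences, num_words) of word ids
        batch, num_sents, num_words = docs.shape
        words = self.embedding(docs.view(batch * num_sents, num_words))
        word_states, _ = self.word_lstm(words)
        sent_vecs = self.word_attn(word_states).view(batch, num_sents, -1)
        sent_states, _ = self.sent_lstm(sent_vecs)
        return self.sent_attn(sent_states)                      # (batch, hidden_dim) document vector


if __name__ == "__main__":
    encoder = HierarchicalAttentionEncoder(vocab_size=10000)
    fake_docs = torch.randint(1, 10000, (2, 5, 12))  # 2 articles, 5 sentences, 12 words each
    print(encoder(fake_docs).shape)                  # torch.Size([2, 256])
```

In this sketch the resulting document vector would be passed to a decoder (e.g., an LSTM-based RNNLM) that generates the summary sentences; that decoder is omitted here because the abstract does not specify its form.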
Year
2019
DOI
10.1109/CISP-BMEI48845.2019.8965919
Venue
CISP-BMEI
DocType
Conference
Citations
0
PageRank
0.34
References
0
Authors
3
Name                Order    Citations    PageRank
Takashi Kuremoto    1        196          27.73
Takuji Tsuruda      2        0            0.34
Shingo Mabu         3        493          77.00