Title
Attention-Based Memory Network for Sentence-Level Question Answering
Abstract
Sentence-level question answering (QA) over news articles is a promising task for social media: a machine must understand a news article and answer a corresponding question by selecting an answer sentence from the article. Recently, several deep neural networks have been proposed for sentence-level QA. To the best of our knowledge, none of them explicitly uses keywords that appear simultaneously in questions and documents. In this paper we introduce the Attention-based Memory Network (Att-MemNN), a new iterative bi-directional attention memory network that predicts answer sentences. It exploits the co-occurrence of keywords between questions and documents as augmented inputs to the network and embeds documents and their corresponding questions in different ways, processing questions with both word-level and contextual-level embeddings while processing documents with word-level embeddings only. Experimental results on the NewsQA test set show that our model yields substantial improvement. We also present quantitative and qualitative analyses to illustrate the results.
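The abstract's co-occurrence keyword feature can be sketched as a per-word indicator over a document sentence: 1 if the word also appears in the question, 0 otherwise, which is then concatenated to the word embeddings as an augmented input. This is a minimal illustrative sketch, not the paper's actual code; the function name, stopword handling, and feature encoding are assumptions.

```python
def cooccurrence_features(question_tokens, sentence_tokens, stopwords=frozenset()):
    """Binary co-occurrence feature per sentence word (hypothetical sketch):
    1.0 if the word (case-insensitive, non-stopword) also occurs in the
    question, else 0.0. Such features could be appended to each word's
    embedding before the sentence enters the network.
    """
    # Lower-case question vocabulary, with stopwords removed.
    q_words = {w.lower() for w in question_tokens} - stopwords
    # One indicator per document word.
    return [1.0 if w.lower() in q_words else 0.0 for w in sentence_tokens]
```

For example, given the question "When did the storm hit?" and the sentence "The storm hit Texas", the words "The", "storm", and "hit" would be flagged as co-occurring keywords.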
Year
2017
DOI
10.1007/978-981-10-6805-8_9
Venue
Communications in Computer and Information Science
Keywords
Sentence-level question answering for news articles, Attention mechanism, Memory network, Deep learning
DocType
Conference
Volume
774
ISSN
1865-0929
Citations
0
PageRank
0.34
References
0
Authors
5
Name            | Order | Citations | PageRank
Pei Liu         | 1     | 4         | 4.47
Chunhong Zhang  | 2     | 93        | 20.35
Weiming Zhang   | 3     | 83        | 15.80
Zhiqiang Zhan   | 4     | 8         | 2.12
Benhui Zhuang   | 5     | 0         | 0.34