Title
Incorporating Relation Knowledge into Commonsense Reading Comprehension with Multi-task Learning
Abstract
This paper focuses on how to take advantage of external relational knowledge to improve machine reading comprehension (MRC) with multi-task learning. Most traditional MRC methods assume that the knowledge needed to derive the correct answer is contained in the given documents. However, in real-world tasks, part of this knowledge may not be mentioned, and machines should be equipped with the ability to leverage external knowledge. In this paper, we integrate relational knowledge into an MRC model for commonsense reasoning. Specifically, on top of a pre-trained language model (LM), we design two auxiliary relation-aware tasks that predict whether a commonsense relation exists between two words and, if so, what its type is, in order to better model the interactions between the document and the candidate answer options. We conduct experiments on two multiple-choice benchmark datasets: SemEval-2018 Task 11 and the Cloze Story Test. The experimental results demonstrate the effectiveness of the proposed method, which achieves superior performance over comparable baselines on both datasets.
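The multi-task setup described in the abstract, a main answer-selection loss combined with two auxiliary relation-aware losses, can be sketched as a weighted sum. The function name and the weights below are illustrative assumptions, not values or code from the paper.

```python
# Illustrative sketch of the multi-task objective from the abstract:
# the main MRC (answer-selection) loss is combined with two auxiliary
# losses -- relation existence (binary) and relation type (multi-class).
# The weights alpha/beta are assumptions, not the paper's tuned values.

def multitask_loss(mrc_loss: float,
                   rel_exist_loss: float,
                   rel_type_loss: float,
                   alpha: float = 0.5,
                   beta: float = 0.5) -> float:
    """Weighted sum of the main loss and the two auxiliary losses."""
    return mrc_loss + alpha * rel_exist_loss + beta * rel_type_loss

# Example: combine per-batch losses from the three prediction heads.
total = multitask_loss(mrc_loss=1.2, rel_exist_loss=0.4, rel_type_loss=0.8)
print(round(total, 2))  # 1.2 + 0.5*0.4 + 0.5*0.8 = 1.8
```

In practice each of the three losses would be a cross-entropy term produced by its own prediction head over the shared LM encoder; the scalar sketch only shows how the heads' losses are combined.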
Year
2019
DOI
10.1145/3357384.3358165
Venue
Proceedings of the 28th ACM International Conference on Information and Knowledge Management
Keywords
commonsense reasoning, machine reading comprehension, multi-task learning
DocType
Conference
ISBN
978-1-4503-6976-3
Citations
3
PageRank
0.46
References
10
Authors
3

| Name | Order | Citations | PageRank |
|---|---|---|---|
| Jiangnan Xia | 1 | 7 | 1.90 |
| Chen Wu | 2 | 23 | 4.23 |
| Ming Yan | 3 | 99 | 8.39 |