Title
Teaching Machines to Read, Answer and Explain
Abstract
With various Pre-trained Language Models (PLMs) blooming, Machine Reading Comprehension (MRC) systems have achieved significant improvements on various benchmarks and have even surpassed human performance. However, most existing works focus only on the accuracy of answer predictions and neglect explanations for those predictions, which is a major obstacle to deploying these models in real-life applications where they must convince humans. This paper proposes a novel unsupervised self-explainable framework, called Recursive Dynamic Gating (RDG), for the machine reading comprehension task. The main idea is that the proposed system tries to use less passage information while achieving results similar to a system that uses the whole passage; the filtered passage then serves as a textual explanation. We carried out experiments on three multiple-choice MRC datasets (in English and Chinese) and found that the proposed system not only achieves better performance in answer prediction but also provides more informative explanations than the attention mechanism.
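The core idea in the abstract can be sketched in a few lines: gate (score) each passage sentence, keep only the highest-scoring subset for answer prediction, and surface that filtered subset as the explanation. The scoring function below (word overlap with the question) and the function name `gate_passage` are illustrative assumptions for this sketch, not the paper's actual Recursive Dynamic Gating mechanism.

```python
# Illustrative sketch of passage filtering for self-explanation:
# score sentences, keep a gated subset, and return it as the explanation.
# The overlap-based score is a stand-in assumption, NOT the RDG method itself.

def gate_passage(sentences, question, keep_ratio=0.5):
    """Return (filtered_sentences, scores); the kept subset acts as the explanation."""
    q_words = set(question.lower().split())
    # Gate score: crude lexical overlap between each sentence and the question.
    scores = [len(q_words & set(s.lower().split())) for s in sentences]
    k = max(1, int(len(sentences) * keep_ratio))
    # Keep the k highest-scoring sentences, preserving their original order.
    kept_idx = sorted(sorted(range(len(sentences)), key=lambda i: -scores[i])[:k])
    return [sentences[i] for i in kept_idx], scores

passage = [
    "The capital of France is Paris.",
    "France is in Western Europe.",
    "Paris hosts the Louvre museum.",
]
explanation, scores = gate_passage(passage, "What is the capital of France?",
                                   keep_ratio=0.34)
print(explanation)  # the filtered passage doubles as the textual explanation
```

The system described in the abstract learns such gates end-to-end so that prediction quality is preserved while unused text is discarded; here the filter is hand-written purely to make the input/output contract concrete.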
Year
2022
DOI
10.1109/TASLP.2022.3156789
Venue
IEEE/ACM Transactions on Audio, Speech, and Language Processing
Keywords
Task analysis, Transformers, Predictive models, Annotations, Speech processing, Information filters, Benchmark testing, Machine reading comprehension, question answering, explainable artificial intelligence
DocType
Journal
Volume
30
Issue
1
ISSN
2329-9290
Citations
0
PageRank
0.34
References
6
Authors
5
Name           Order  Citations  PageRank
Yiming Cui     1      871        3.40
Ting Liu       2      27352      32.31
Wanxiang Che   3      7116       6.39
Zhigang Chen   4      2043       4.10
Shijin Wang    5      1803       1.56