Title
DUMA: Reading Comprehension With Transposition Thinking
Abstract
Multi-choice Machine Reading Comprehension (MRC) requires models to select the correct answer from a set of answer options when given a passage and a question. Thus, in addition to a powerful Pre-trained Language Model (PrLM) as an encoder, multi-choice MRC especially relies on a matching-network design that effectively captures the relationships among the triplet of passage, question, and answers. Although newer and more powerful PrLMs have shown their strength even without the support of a matching network, we propose a new DUal Multi-head Co-Attention (DUMA) model. It is inspired by the human transposition-thinking process for solving the multi-choice MRC problem: considering the other party's focus from the standpoints of both the passage and the question. The proposed DUMA is shown to be effective and generally capable of promoting PrLMs. We evaluate our method on two benchmark multi-choice MRC tasks, DREAM and RACE. Our results show that DUMA can further boost even powerful PrLMs to higher performance.
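The dual co-attention idea in the abstract (passage attends to the question-answer pair while the question-answer pair attends back to the passage, with both directions fused for scoring) can be illustrated with a minimal NumPy sketch. This is not the paper's implementation: projection matrices are omitted (identity projections), mean-pooling plus concatenation is one plausible fusion choice, and all function names and dimensions are illustrative.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def multi_head_co_attention(query, key_value, num_heads):
    """Cross-attention: `query` tokens attend over `key_value` tokens.
    query: (Lq, d), key_value: (Lk, d); num_heads must divide d.
    Per-head projection weights are omitted for brevity."""
    Lq, d = query.shape
    dh = d // num_heads
    out = np.empty_like(query)
    for h in range(num_heads):
        s = slice(h * dh, (h + 1) * dh)
        q, k, v = query[:, s], key_value[:, s], key_value[:, s]
        attn = softmax(q @ k.T / np.sqrt(dh), axis=-1)  # (Lq, Lk)
        out[:, s] = attn @ v                            # (Lq, dh)
    return out

def duma_fuse(passage_enc, qa_enc, num_heads=4):
    """Dual co-attention: passage attends to the question-answer
    encoding and vice versa; both directions are mean-pooled and
    concatenated into one fused vector (an assumed fusion choice)."""
    p2qa = multi_head_co_attention(passage_enc, qa_enc, num_heads)
    qa2p = multi_head_co_attention(qa_enc, passage_enc, num_heads)
    return np.concatenate([p2qa.mean(axis=0), qa2p.mean(axis=0)])  # (2d,)

# Toy usage: one fused vector per answer option; a linear scorer
# (random here, learned in practice) picks the highest-scoring option.
rng = np.random.default_rng(0)
d = 16
options = [duma_fuse(rng.normal(size=(20, d)), rng.normal(size=(8, d)))
           for _ in range(4)]
scores = np.stack(options) @ rng.normal(size=2 * d)  # shape (4,)
best = int(np.argmax(scores))
```

In the full model, `passage_enc` and `qa_enc` would come from a shared PrLM encoder over the concatenated input, split back into passage and question-answer token spans.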
Year: 2022
DOI: 10.1109/TASLP.2021.3138683
Venue: IEEE/ACM Transactions on Audio, Speech and Language Processing
Keywords: Task analysis, Training, Transformers, Speech processing, Bit error rate, Bidirectional control, Context modeling, Attention network, machine reading comprehension, pre-trained language model
DocType: Journal
Volume: 10.5555
Issue: taslp.2022.issue-30
ISSN: 2329-9290
Citations: 1
PageRank: 0.39
References: 7
Authors: 3
Name            Order   Citations   PageRank
Pengfei Zhu     1       249         31.05
Hai Zhao        2       960         113.64
Li Xiaoguang    3       1           0.39