| Title |
| --- |
| Multi-task Learning for Conversational Question Answering Over a Large-Scale Knowledge Base |
| Abstract |
| --- |
| We consider the problem of conversational question answering over a large-scale knowledge base. To handle the huge entity vocabulary of a large-scale knowledge base, recent neural semantic parsing based approaches usually decompose the task into several subtasks and then solve them sequentially, which leads to the following issues: 1) errors in earlier subtasks are propagated and negatively affect downstream ones; and 2) each subtask cannot naturally share supervision signals with the others. To tackle these issues, we propose an innovative multi-task learning framework in which a pointer-equipped semantic parsing model is designed to resolve coreference in conversations and naturally enables joint learning with a novel type-aware entity detection model. The proposed framework thus enables shared supervision and alleviates the effect of error propagation. Experiments on a large-scale conversational question answering dataset containing 1.6M question answering pairs over 12.8M entities show that the proposed framework improves the overall F1 score from 67% to 79% compared with previous state-of-the-art work. |
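The abstract's central idea, a shared encoder trained jointly on semantic parsing and entity detection so that both supervision signals shape the same representation, can be sketched as below. This is a minimal illustration under stated assumptions, not the paper's actual architecture: the module names, dimensions, and equal loss weighting are hypothetical, and the real model additionally uses a pointer mechanism for coreference resolution.

```python
import torch
import torch.nn as nn

class SharedEncoder(nn.Module):
    """Hypothetical shared utterance encoder; the paper's encoder may differ."""
    def __init__(self, vocab_size=1000, hidden=128):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, hidden)
        self.rnn = nn.GRU(hidden, hidden, batch_first=True)

    def forward(self, tokens):
        out, _ = self.rnn(self.embed(tokens))
        return out  # (batch, seq_len, hidden)

class MultiTaskModel(nn.Module):
    """One encoder feeding two task heads: logical-form action prediction
    (semantic parsing) and per-token entity-type detection."""
    def __init__(self, vocab_size=1000, hidden=128, n_actions=50, n_types=20):
        super().__init__()
        self.encoder = SharedEncoder(vocab_size, hidden)
        self.parser_head = nn.Linear(hidden, n_actions)  # semantic parsing head
        self.entity_head = nn.Linear(hidden, n_types)    # entity detection head

    def forward(self, tokens):
        h = self.encoder(tokens)
        return self.parser_head(h), self.entity_head(h)

model = MultiTaskModel()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
ce = nn.CrossEntropyLoss()

# Dummy batch and labels, purely for illustration.
tokens = torch.randint(0, 1000, (4, 12))
action_gold = torch.randint(0, 50, (4, 12))
type_gold = torch.randint(0, 20, (4, 12))

# Joint training step: both losses backpropagate into the shared encoder,
# so each task's supervision benefits the other instead of being consumed
# one subtask at a time in a pipeline.
opt.zero_grad()
parse_logits, type_logits = model(tokens)
loss = ce(parse_logits.reshape(-1, 50), action_gold.reshape(-1)) \
     + ce(type_logits.reshape(-1, 20), type_gold.reshape(-1))
loss.backward()
opt.step()
```

Summing the two cross-entropy terms with equal weights is a simplifying choice here, not something the abstract specifies; the point is only that joint optimization replaces the sequential pipeline whose early-stage errors would otherwise propagate downstream.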
| Year | DOI | Venue |
| --- | --- | --- |
| 2019 | 10.18653/v1/D19-1248 | EMNLP/IJCNLP (1) |

| DocType | Volume | Citations |
| --- | --- | --- |
| Conference | D19-1 | 0 |

| PageRank | References | Authors |
| --- | --- | --- |
| 0.34 | 0 | 8 |
| Name | Order | Citations | PageRank |
| --- | --- | --- | --- |
| Tao Shen | 1 | 4 | 2.09 |
| Xiubo Geng | 2 | 250 | 16.43 |
| Tao Qin | 3 | 2384 | 147.25 |
| Daya Guo | 4 | 6 | 4.81 |
| Duyu Tang | 5 | 883 | 36.98 |
| Nan Duan | 6 | 213 | 45.87 |
| Guodong Long | 7 | 655 | 47.27 |
| Daxin Jiang | 8 | 1316 | 72.60 |