Title |
---|
Conversational Question Answering over Knowledge Graphs with Transformer and Graph Attention Networks |
Abstract |
---|
This paper addresses the task of (complex) conversational question answering over a knowledge graph. For this task, we propose LASAGNE (muLti-task semAntic parSing with trAnsformer and Graph atteNtion nEtworks). It is the first approach that employs a transformer architecture extended with Graph Attention Networks for multi-task neural semantic parsing. LASAGNE uses a transformer model to generate the base logical forms, while the Graph Attention model exploits correlations between (entity) types and predicates to produce node representations. LASAGNE also includes a novel entity recognition module that detects, links, and ranks all relevant entities in the question context. We evaluate LASAGNE on a standard dataset for complex sequential question answering, on which it outperforms the existing baselines' averages on all question types. Specifically, we show that LASAGNE improves the F1-score on eight out of ten question types; in some cases, the increase in F1-score is more than 20% compared to the state of the art. |
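For context on the Graph Attention component mentioned in the abstract, below is a minimal pure-Python sketch of a single-head graph-attention layer in the general style such models use (attention over neighbours, softmax-normalised, weighted aggregation). The feature dimensions, weight matrix `W`, attention vector `a`, and toy graph are illustrative assumptions, not LASAGNE's actual parameters or implementation:

```python
import math

def leaky_relu(x, slope=0.2):
    return x if x > 0.0 else slope * x

def gat_layer(features, adjacency, W, a):
    """One single-head graph-attention layer (illustrative sketch).

    features  -- list of node feature vectors
    adjacency -- adjacency[i] lists the neighbour indices of node i
    W         -- projection matrix (list of rows), a is the attention vector
    """
    def matvec(M, v):
        return [sum(m * x for m, x in zip(row, v)) for row in M]

    # project every node: h_i = W x_i
    h = [matvec(W, f) for f in features]
    out = []
    for i, nbrs in enumerate(adjacency):
        js = [i] + list(nbrs)  # include a self-loop
        # attention logit e_ij = LeakyReLU(a . [h_i || h_j])
        e = [leaky_relu(sum(ak * x for ak, x in zip(a, h[i] + h[j]))) for j in js]
        # softmax over the neighbourhood (numerically stabilised)
        m = max(e)
        w = [math.exp(x - m) for x in e]
        s = sum(w)
        alpha = [x / s for x in w]
        # new representation: attention-weighted sum of neighbour projections
        out.append([sum(alpha[k] * h[j][d] for k, j in enumerate(js))
                    for d in range(len(h[i]))])
    return out
```

Because each output row is a convex combination of the projected neighbour features, stacking such layers lets node representations (here, entity types and predicates) absorb information from correlated neighbours.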
Year | Venue | DocType |
---|---|---|
2021 | EACL | Conference |
Citations | PageRank | References |
0 | 0.34 | 0 |
Authors |
---|
6 |
Name | Order | Citations | PageRank |
---|---|---|---|
Endri Kacupaj | 1 | 5 | 2.18 |
Joan Plepi | 2 | 0 | 0.68 |
Kuldeep Singh | 3 | 0 | 0.68 |
Harsh Thakkar | 4 | 0 | 0.34 |
Jens Lehmann | 5 | 0 | 0.34 |
Maria Maleshkova | 6 | 5 | 1.50 |