Abstract |
---|
In this paper, we present a graph-based Transformer for semantic parsing. We separate the semantic parsing task into two steps: 1) use a sequence-to-sequence model to generate logical form candidates; 2) design a graph-based Transformer to rerank the candidates. To handle the structure of logical forms, we incorporate graph information into the Transformer and design a cross-candidate verification mechanism that considers all candidates jointly during ranking. Furthermore, we integrate BERT into our model and jointly train the graph-based Transformer and BERT. We conduct experiments on three semantic parsing benchmarks: ATIS, JOBS, and the Task Oriented semantic Parsing (TOP) dataset. Experiments show that our graph-based reranking model achieves results comparable to state-of-the-art models on the ATIS and JOBS datasets, and on the TOP dataset our model achieves a new state-of-the-art result. |
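The two-step pipeline in the abstract can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: the candidate generator is a stub standing in for a sequence-to-sequence decoder, and the token-voting heuristic is only a toy stand-in for the paper's cross-candidate verification mechanism (the real model scores candidates with a graph-based Transformer jointly trained with BERT). All function names and the example logical forms are assumptions.

```python
# Sketch of generate-then-rerank, assuming a hypothetical candidate
# generator and a toy cross-candidate scoring heuristic.
from collections import Counter

def generate_candidates(utterance):
    # Stand-in for a seq2seq model with beam search; returns
    # hypothetical logical-form candidates for the utterance.
    return [
        "answer(flight(from(atlanta), to(boston)))",
        "answer(flight(from(boston), to(atlanta)))",
        "answer(airline(from(atlanta)))",
    ]

def _tokens(logical_form):
    # Crude tokenizer: treat parentheses as separators.
    return logical_form.replace("(", " ").replace(")", " ").split()

def cross_candidate_score(candidate, candidates):
    # Toy cross-candidate signal: tokens appearing in the other
    # candidates vote for this one, so each candidate is scored
    # against the whole list rather than in isolation.
    votes = Counter()
    for other in candidates:
        if other != candidate:
            votes.update(_tokens(other))
    return sum(votes[t] for t in set(_tokens(candidate)))

def rerank(candidates):
    # Step 2: pick the candidate most supported by the full list.
    return max(candidates, key=lambda c: cross_candidate_score(c, candidates))

cands = generate_candidates("flights from atlanta to boston")
best = rerank(cands)
```

In the real model the reranker also encodes each logical form's graph structure; here the point is only the control flow: generation produces a candidate list, and the reranker's score for any one candidate depends on all of them.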
Year | Venue | DocType
---|---|---
2020 | National Conference on Artificial Intelligence | Conference

Volume | ISSN | Citations
---|---|---
34 | 2159-5399 | 0

PageRank | References | Authors
---|---|---
0.34 | 0 | 6
Name | Order | Citations | PageRank |
---|---|---|---
Bo Shao | 1 | 2 | 4.13 |
Yeyun Gong | 2 | 94 | 16.67 |
Weizhen Qi | 3 | 0 | 3.04 |
Guihong Cao | 4 | 0 | 0.68 |
Jianshu Ji | 5 | 16 | 2.29 |
Xiaola Lin | 6 | 1099 | 78.09 |