Title
---
Multi-Step Deductive Reasoning Over Natural Language: An Empirical Study on Out-of-Distribution Generalisation
Abstract
---
Combining deep learning with symbolic logic reasoning aims to capitalize on the success of both fields and is drawing increasing attention. Inspired by DeepLogic, an end-to-end model trained to perform inference on logic programs, we introduce IMA-GloVe-GA, an iterative neural inference network for multi-step reasoning expressed in natural language. In our model, reasoning is performed by an iterative memory neural network based on an RNN with a gate attention mechanism. We evaluate IMA-GloVe-GA on three datasets: PARARULES, CONCEPTRULES V1 and CONCEPTRULES V2. Experimental results show that DeepLogic with gate attention achieves higher test accuracy than DeepLogic and other RNN baseline models. Our model also achieves better out-of-distribution generalisation than RoBERTa-Large when the rules have been shuffled. Furthermore, to address the unbalanced distribution of reasoning depths in existing multi-step reasoning datasets, we develop PARARULE-Plus, a large dataset containing more examples that require deeper reasoning steps. Experimental results show that training with PARARULE-Plus improves model performance on examples requiring deeper reasoning. The source code and data are available at https://github.com/Strong-AI-Lab/Multi-Step-Deductive-Reasoning-Over-Natural-Language.
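The abstract only sketches the reasoning module in words. As an illustration, the snippet below shows one plausible reading of "an iterative memory neural network based on an RNN with a gate attention mechanism" in PyTorch. This is a minimal sketch under assumptions, not the authors' IMA-GloVe-GA implementation: the class name `GateAttentionReasoner`, the embedding dimension, and the fixed step count are all hypothetical; the actual code lives in the repository linked above.

```python
# Illustrative sketch only (hypothetical names/shapes), not the authors' code.
import torch
import torch.nn as nn
import torch.nn.functional as F

class GateAttentionReasoner(nn.Module):
    """At each reasoning step, attend over rule embeddings and gate how
    much of the attended content flows into an RNN-updated memory."""

    def __init__(self, dim: int, steps: int = 4):
        super().__init__()
        self.steps = steps
        self.attn = nn.Linear(2 * dim, 1)    # scores a (memory, rule) pair
        self.gate = nn.Linear(2 * dim, dim)  # element-wise update gate
        self.cell = nn.GRUCell(dim, dim)     # RNN-based memory update

    def forward(self, query: torch.Tensor, rules: torch.Tensor) -> torch.Tensor:
        # query: (batch, dim) embedding of the statement to prove
        # rules: (batch, n_rules, dim) embeddings of the context rules/facts
        memory = query
        for _ in range(self.steps):
            m = memory.unsqueeze(1).expand_as(rules)
            scores = self.attn(torch.cat([m, rules], dim=-1)).squeeze(-1)
            weights = F.softmax(scores, dim=-1)            # attention over rules
            read = torch.bmm(weights.unsqueeze(1), rules).squeeze(1)
            g = torch.sigmoid(self.gate(torch.cat([memory, read], dim=-1)))
            memory = self.cell(g * read, memory)           # gated memory update
        return memory  # a classifier head would map this to True/False

# Usage: batch of 2 queries, 5 rules each, 64-dim (e.g. GloVe-based) embeddings.
model = GateAttentionReasoner(dim=64, steps=4)
out = model(torch.randn(2, 64), torch.randn(2, 5, 64))
print(out.shape)  # torch.Size([2, 64])
```

The gate decides, per dimension, how much of the attended rule content enters the memory at each step; running the loop for several steps is what lets a model of this shape chain multiple rules, which is the behaviour the paper evaluates under varying reasoning depths.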
Year | Venue | DocType
---|---|---
2022 | International Workshop on Neural-Symbolic Learning and Reasoning (NeSy) | Conference
Citations | PageRank | References
---|---|---
0 | 0.34 | 0
Authors (7)
---
Name | Order | Citations | PageRank |
---|---|---|---
Qiming Bao | 1 | 0 | 1.01 |
Alex Yuxuan Peng | 2 | 0 | 0.34 |
Tim Hartill | 3 | 0 | 0.68 |
Neset Tan | 4 | 0 | 0.34 |
Zhenyun Deng | 5 | 0 | 1.01 |
Michael Witbrock | 6 | 0 | 1.69 |
Jiamou Liu | 7 | 49 | 23.19 |