Title |
---|
Bootstrapping incremental dialogue systems from minimal data: the generalisation power of dialogue grammars. |
Abstract |
---|
We investigate an end-to-end method for automatically inducing task-based dialogue systems from small amounts of unannotated dialogue data. It combines an incremental semantic grammar (Dynamic Syntax and Type Theory with Records, DS-TTR) with Reinforcement Learning (RL), treating language generation and dialogue management as a joint decision problem. The systems thus produced are incremental: dialogues are processed word by word, a property previously shown to be essential for supporting natural, spontaneous dialogue. We hypothesised that the rich linguistic knowledge within the grammar should enable a combinatorially large number of dialogue variations to be processed, even when trained on very few dialogues. Our experiments show that our model can process 74% of the Facebook AI bAbI dataset even when trained on only 0.13% of the data (5 dialogues). It can in addition process 65% of bAbI+, a corpus we created by systematically adding incremental dialogue phenomena such as restarts and self-corrections to bAbI. We compare our model with a state-of-the-art retrieval model, MemN2N, and find that, in terms of semantic accuracy, MemN2N shows very poor robustness to the bAbI+ transformations even when trained on the full bAbI dataset. |
Year | DOI | Venue |
---|---|---|
2017 | 10.18653/v1/d17-1236 | Empirical Methods in Natural Language Processing |
DocType | Volume | ISSN |
---|---|---|
Journal | abs/1709.07858 | Proceedings of the 2017 Conference on Empirical Methods in Natural Language Processing (ISBN 978-1-945626-83-8), pp. 2210-2220. Copenhagen, Denmark, September 7-11, 2017 |
Citations | PageRank | References |
---|---|---|
2 | 0.37 | 7 |
Authors |
---|
3 |
Name | Order | Citations | PageRank |
---|---|---|---|
Arash Eshghi | 1 | 18 | 7.61 |
Igor Shalyminov | 2 | 9 | 3.30 |
Oliver Lemon | 3 | 135 | 14.94 |