Title |
---|
Learning Distributed Representations of Symbolic Structure Using Binding and Unbinding Operations |
Abstract |
---|
Widely used recurrent units, including the Long Short-Term Memory (LSTM) and the Gated Recurrent Unit (GRU), perform well on natural language tasks, but their ability to learn structured representations is still questionable. Exploiting Tensor Product Representations (TPRs) --- distributed representations of symbolic structure in which vector-embedded symbols are bound to vector-embedded structural positions --- we propose the TPRU, a recurrent unit that, at each time step, explicitly executes structural-role binding and unbinding operations to incorporate structural information into learning. Experiments are conducted on both the Logical Entailment task and the Multi-genre Natural Language Inference (MNLI) task, and our TPR-derived recurrent unit provides strong performance with significantly fewer parameters than LSTM and GRU baselines. Furthermore, our learnt TPRU trained on MNLI demonstrates solid generalisation ability on downstream tasks. |
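The binding and unbinding operations the abstract refers to can be illustrated with a minimal NumPy sketch of plain TPR algebra (a hypothetical illustration, not the paper's TPRU code): binding forms the outer product of a filler (symbol) vector with a role (position) vector, a structure is the sum of its bindings, and unbinding contracts the resulting tensor with a role's dual vector to recover the filler bound to that role.

```python
import numpy as np

# Hypothetical TPR binding/unbinding sketch; dimensions and names are
# illustrative, not taken from the paper.
rng = np.random.default_rng(0)

d_fill, n_roles = 4, 3
fillers = rng.standard_normal((n_roles, d_fill))  # one filler per structural position
roles = np.eye(n_roles)                           # orthonormal role vectors

# Binding: sum over positions of the outer product filler_i (x) role_i,
# giving a single d_fill x n_roles tensor that encodes the whole structure.
tpr = sum(np.outer(fillers[i], roles[i]) for i in range(n_roles))

# Unbinding: contract the TPR with a role's dual vector (with orthonormal
# roles, each role vector is its own dual) to read out the bound filler.
recovered = tpr @ roles[1]
assert np.allclose(recovered, fillers[1])
```

With non-orthogonal roles the duals would instead come from the inverse (or pseudo-inverse) of the role matrix; the orthonormal case above keeps the sketch minimal.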
Year | Venue | DocType |
---|---|---|
2018 | arXiv: Neural and Evolutionary Computing | Journal |

Volume | Citations | PageRank |
---|---|---|
abs/1810.12456 | 1 | 0.35 |

References | Authors |
---|---|
38 | 3 |
Name | Order | Citations | PageRank |
---|---|---|---|
Shuai Tang | 1 | 6 | 2.86 |
Paul Smolensky | 2 | 215 | 93.76 |
Virginia R. de Sa | 3 | 31 | 8.29 |