Title
A Neural-Symbolic Approach to Natural Language Tasks
Abstract
Deep learning (DL) has been widely used in natural language processing (NLP) applications in recent years owing to its superior performance. However, while natural languages are rich in grammatical structure, DL models have not been able to explicitly represent and enforce such structure. This paper proposes a new architecture that bridges this gap by exploiting tensor product representations (TPR), a structured neural-symbolic framework developed in cognitive science over the past 20 years, with the aim of integrating DL with explicit language structures and rules. We call the architecture the Tensor Product Generation Network (TPGN) and apply it to image captioning. The key ideas of TPGN are: 1) unsupervised learning of role-unbinding vectors of words via a TPR-based deep neural network, and 2) integration of TPR with typical DL architectures, including Long Short-Term Memory (LSTM) models. The novelty of our approach lies in its ability to generate a sentence and extract the partial grammatical structure of that sentence using role-unbinding vectors, which are learned in an unsupervised manner. Experimental results demonstrate the effectiveness of the proposed approach.
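The TPR mechanism the abstract builds on, binding filler (word) vectors to role vectors via outer products and later recovering a filler by unbinding with the role's dual vector, can be sketched in a few lines of NumPy. The dimensions, random fillers, and orthonormal roles below are illustrative assumptions for a toy example, not the paper's actual model or parameters.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setup (assumed dimensions): filler vectors stand in for word
# embeddings, role vectors stand in for structural positions.
d_f, d_r, n = 4, 3, 3                 # filler dim, role dim, bindings
F = rng.normal(size=(n, d_f))         # filler vectors f_1..f_n
Q, _ = np.linalg.qr(rng.normal(size=(d_r, d_r)))
R = Q[:n]                             # orthonormal role vectors r_1..r_n

# Binding: the TPR is the sum of outer products f_i (x) r_i.
T = sum(np.outer(F[i], R[i]) for i in range(n))   # shape (d_f, d_r)

# Unbinding: with orthonormal roles, the dual of r_j is r_j itself,
# so T r_j = sum_i f_i (r_i . r_j) recovers the filler bound to role j.
f_hat = T @ R[1]
assert np.allclose(f_hat, F[1])
```

With non-orthonormal roles the dual vectors are instead the rows of the pseudo-inverse of the role matrix; TPGN's contribution, per the abstract, is learning such role-unbinding vectors without supervision rather than fixing them by hand.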
Year
2017
Venue
arXiv: Computation and Language
Field
Temporal annotation, Computer science, Natural language programming, Natural language, Unsupervised learning, Natural language processing, Universal Networking Language, Language identification, Artificial intelligence, Deep learning, Artificial neural network
DocType
Journal
Volume
abs/1710.11475
Citations
0
PageRank
0.34
References
10
Authors
5
Authors
Order  Name              Citations / PageRank
1      Qiuyuan Huang     17617.66
2      Paul Smolensky    21593.76
3      Xiaodong He       3858190.28
4      Li Deng           9691728.14
5      Dapeng Oliver Wu  41.43