Title
Tensor Product Generation Networks for Deep NLP Modeling
Abstract
We present a new approach to the design of deep networks for natural language processing (NLP), based on the general technique of Tensor Product Representations (TPRs) for encoding and processing symbol structures in distributed neural networks. A network architecture, the Tensor Product Generation Network (TPGN), is proposed which is capable in principle of carrying out TPR computation, but which uses unconstrained deep learning to design its internal representations. Instantiated in a model for image-caption generation, TPGN outperforms LSTM baselines when evaluated on the COCO dataset. The TPR-capable structure enables interpretation of internal representations and operations, which prove to contain considerable grammatical content. Our caption-generation model can be interpreted as generating sequences of grammatical categories and retrieving words by their categories from a plan encoded as a distributed representation.
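The Tensor Product Representation scheme the abstract builds on can be illustrated concretely. In a minimal sketch (our own, not code from the paper), each symbol "filler" vector is bound to a structural "role" vector by an outer product, the bindings are superposed by summation, and a filler is recovered by contracting the sum with the dual of its role vector. Assuming linearly independent role vectors, recovery is exact:

```python
import numpy as np

# Illustrative TPR sketch (not from the paper):
#   T = sum_i outer(f_i, r_i)
# With linearly independent roles, each filler f_i is recovered by
# contracting T with the dual (unbinding) vector of role r_i.

rng = np.random.default_rng(0)

d_f, n = 4, 3                          # filler dimension, number of bindings
fillers = rng.normal(size=(n, d_f))    # e.g. word embeddings
roles = np.eye(n)                      # orthonormal roles: positions 0, 1, 2

# Bind each filler to its role and superpose: T has shape (d_f, n)
T = sum(np.outer(f, r) for f, r in zip(fillers, roles))

# Dual roles are rows of the pseudoinverse of the role matrix
# (for orthonormal roles the duals equal the roles themselves).
duals = np.linalg.pinv(roles).T

recovered = T @ duals[1]               # unbind the filler stored at role 1
assert np.allclose(recovered, fillers[1])
```

The same algebra underlies the interpretation offered in the abstract: a distributed "plan" tensor can simultaneously hold several category/word bindings, and unbinding by role retrieves them individually.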
Year
2018
DOI
10.18653/v1/N18-1114
Venue
NAACL-HLT
Field
Tensor product, Grammatical category, Computer science, Network architecture, Artificial intelligence, Natural language processing, Deep learning, Artificial neural network, Distributed representation, Encoding (memory), Computation
DocType
Conference
Volume
1
Citations
2
PageRank
0.35
References
14
Authors
5
Qiuyuan Huang (Order: 1, Citations/PageRank: 17617.66)
Paul Smolensky (Order: 2, Citations/PageRank: 21593.76)
Xiaodong He (Order: 3, Citations/PageRank: 3858190.28)
Li Deng (Order: 4, Citations/PageRank: 9691728.14)
Dapeng Oliver Wu (Order: 5, Citations/PageRank: 41.43)