Title
Neural Tree Indexers for Text Understanding.
Abstract
Recurrent neural networks (RNNs) process input text sequentially and model the conditional transition between word tokens. In contrast, recursive networks explicitly model the compositionality and the recursive structure of natural language. However, current recursive architectures are limited by their dependence on syntactic trees. In this paper, we introduce a robust, syntactic-parsing-independent tree-structured model, Neural Tree Indexers (NTI), that provides a middle ground between sequential RNNs and syntactic tree-based recursive models. NTI constructs a full n-ary tree by processing the input text with its node function in a bottom-up fashion. An attention mechanism can then be applied to both the structure and the node function. We implemented and evaluated a binary-tree model of NTI, showing that the model achieves state-of-the-art performance on three different NLP tasks: natural language inference, answer sentence selection, and sentence classification, outperforming state-of-the-art recurrent and recursive neural networks.
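To illustrate the bottom-up composition described in the abstract, the following minimal sketch (plain Python with NumPy, not the authors' implementation) treats word vectors as leaves and repeatedly combines pairs of adjacent nodes with a node function until a single root representation remains. The MLP node function, the padding of odd-length levels, and all parameter shapes are illustrative assumptions rather than details taken from the paper.

import numpy as np

def node_fn(left, right, W, b):
    # Illustrative node function: an MLP over the concatenated child vectors.
    return np.tanh(W @ np.concatenate([left, right]) + b)

def nti_encode(word_vecs, W, b):
    # Compose a sentence bottom-up over a binary tree of node_fn applications.
    nodes = list(word_vecs)
    while len(nodes) > 1:
        if len(nodes) % 2 == 1:
            nodes.append(nodes[-1])  # assumed padding: repeat the last node on odd levels
        nodes = [node_fn(nodes[i], nodes[i + 1], W, b)
                 for i in range(0, len(nodes), 2)]
    return nodes[0]  # root vector used as the sentence representation

if __name__ == "__main__":
    d = 8
    rng = np.random.default_rng(0)
    words = [rng.standard_normal(d) for _ in range(5)]   # toy word embeddings
    W, b = rng.standard_normal((d, 2 * d)) * 0.1, np.zeros(d)
    print(nti_encode(words, W, b).shape)                 # (8,)

In the paper itself the node function is a learned neural module (e.g., an LSTM-style cell) and attention can be applied over the tree nodes; the sketch above only shows the tree-indexing control flow.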
Year
2016
DOI
10.18653/v1/e17-1002
Venue
Conference of the European Chapter of the Association for Computational Linguistics
Field
Principle of compositionality, Computer science, Recursive language, Natural language, Artificial intelligence, Natural language processing, Artificial neural network, Syntax, Sentence, Recursion, Machine learning, Limiting
DocType
Journal
Volume
abs/1607.04492
Citations
24
PageRank
1.23
References
31
Authors
2
Name | Order | Citations | PageRank
Tsendsuren Munkhdalai | 1 | 169 | 13.49
Hong Yu | 2 | 37 | 4.90