Title: Adaptive Multi-Compositionality for Recursive Neural Network Models
Abstract
Recursive neural network models have achieved promising results in many natural language processing tasks. The main difference among these models lies in the composition function, i.e., how to obtain the vector representation of a phrase or sentence from the representations of the words it contains. This paper introduces a novel Adaptive Multi-Compositionality (AdaMC) layer for recursive neural network models. The basic idea is to use more than one composition function and to select among them adaptively depending on the input vectors. We develop a general framework that models semantic composition as a distribution over these composition functions. The composition functions and the parameters used for adaptive selection are jointly learnt under the supervision of specific tasks. We integrate AdaMC into existing recursive neural network models and conduct extensive experiments on the Stanford Sentiment Treebank and a semantic relation classification task. The experimental results demonstrate that AdaMC improves the performance of recursive neural network models and outperforms the baseline methods.
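The adaptive selection described in the abstract can be sketched as a softmax gate that produces a distribution over several composition functions and mixes their outputs. The sketch below is a minimal illustration under assumed shapes and parameter names (one affine-plus-tanh composition per function, a linear gating score over the concatenated children); it is not the paper's exact formulation.

```python
import numpy as np

rng = np.random.default_rng(0)
d, C = 4, 3  # embedding size and number of composition functions (assumed)

# One weight matrix per composition function: maps [left; right] -> parent.
W = rng.standard_normal((C, d, 2 * d)) * 0.1
# Gating parameters: score each composition function from the children.
S = rng.standard_normal((C, 2 * d)) * 0.1

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def adamc_compose(left, right):
    """Combine two child vectors into a parent vector as a
    gate-weighted mixture of C candidate composition functions."""
    x = np.concatenate([left, right])   # [2d] concatenated children
    gate = softmax(S @ x)               # [C] distribution over functions
    candidates = np.tanh(W @ x)         # [C, d] one candidate parent each
    return gate @ candidates            # [d] adaptive mixture

parent = adamc_compose(rng.standard_normal(d), rng.standard_normal(d))
```

Applied bottom-up over a parse tree, this rule produces a vector per phrase; the gate and the composition weights would be trained jointly from task supervision, as the abstract describes.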
Year: 2016
DOI: 10.1109/TASLP.2015.2509257
Venue: IEEE/ACM Trans. Audio, Speech & Language Processing
Keywords: Semantics, Neural networks, Adaptation models, Computational modeling, Syntactics, Tensile stress, Adaptive systems
Field: Computer science, Recurrent neural network, Phrase, Time delay neural network, Artificial intelligence, Artificial neural network, Principle of compositionality, Nervous system network models, Pattern recognition, Probabilistic neural network, Speech recognition, Treebank, Machine learning
DocType: Journal
Volume: 24
Issue: 3
ISSN: 2329-9290
Citations: 4
PageRank: 0.41
References: 20
Authors: 5

Name          Order  Citations  PageRank
Li Dong       1      582        31.86
Furu Wei      2      1956       107.57
Ke Xu         3      1433       99.79
Shixia Liu    4      2095       82.41
Ming Zhou     5      4262       251.74