Title
Dependent Multilevel Interaction Network For Natural Language Inference
Abstract
Neural networks have attracted great attention for natural language inference in recent years. Interactions between the premise and the hypothesis have proven effective in improving the representations. Existing methods have mainly focused on a single interaction, while multiple interactions remain underexplored. In this paper, we propose a dependent multilevel interaction (DMI) network that models multiple interactions between the premise and the hypothesis to boost the performance of natural language inference. Specifically, a single-interaction unit (SIU) structure with a novel combining attention mechanism is presented to capture features in an interaction. We then cascade a series of SIUs in a multilevel interaction layer to obtain more comprehensive features. Experiments on two benchmark datasets, namely SciTail and SNLI, show the effectiveness of our proposed model. Our model outperforms the state-of-the-art approaches on the SciTail dataset without using any external resources. On the SNLI dataset, our model also achieves competitive results.
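The core idea of cascading interaction units can be illustrated with a minimal sketch. This is not the authors' actual DMI implementation (the paper's combining attention and unit internals are not specified here); it only shows the general pattern the abstract describes: a single-interaction unit that cross-attends premise and hypothesis, stacked so each level refines the previous level's output. All function names and the element-wise combination are illustrative assumptions.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def siu(premise, hypothesis):
    """One single-interaction unit (sketch): cross-attention between the
    two sentences, then a simple residual combination of original and
    attended features (an assumption, not the paper's exact mechanism)."""
    scores = premise @ hypothesis.T                 # (lp, lh) alignment scores
    p_att = softmax(scores, axis=1) @ hypothesis    # hypothesis-aware premise
    h_att = softmax(scores.T, axis=1) @ premise     # premise-aware hypothesis
    return premise + p_att, hypothesis + h_att

def multilevel_interaction(premise, hypothesis, levels=3):
    """Cascade several SIUs; each level consumes the previous level's
    refined representations, mirroring the multilevel interaction layer."""
    for _ in range(levels):
        premise, hypothesis = siu(premise, hypothesis)
    return premise, hypothesis

# toy example: 4 premise tokens, 5 hypothesis tokens, dimension 8
rng = np.random.default_rng(0)
p = rng.normal(size=(4, 8))
h = rng.normal(size=(5, 8))
p_out, h_out = multilevel_interaction(p, h, levels=3)
print(p_out.shape, h_out.shape)  # token counts and dimension are preserved
```

Each SIU keeps the sequence lengths and feature dimension unchanged, which is what makes the units stackable into an arbitrarily deep cascade.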
Year: 2019
DOI: 10.1007/978-3-030-30490-4_2
Venue: ARTIFICIAL NEURAL NETWORKS AND MACHINE LEARNING - ICANN 2019: TEXT AND TIME SERIES, PT IV
Keywords: Deep learning, Sentence interaction, Attention mechanism
DocType: Conference
Volume: 11730
ISSN: 0302-9743
Citations: 0
PageRank: 0.34
References: 0
Authors: 7
Name                Order  Citations  PageRank
Yun Li              1      0          0.34
Yan Yang            2      0          0.68
Yong Deng           3      0          0.34
Qinmin Vivian Hu    4      20         6.06
Chengcai Chen       5      0          0.68
Liang He            6      61         20.38
Zhou Yu             7      278        39.88