Title
End-Task Oriented Textual Entailment via Deep Explorations of Inter-Sentence Interactions
Abstract
This work deals with SciTail, a natural entailment challenge derived from a multiple-choice question answering problem. The premises and hypotheses in SciTail were generated independently of each other and were not written with the entailment task in mind. This makes it more challenging than other entailment datasets and more directly useful to the end task -- question answering. We propose DEISTE (deep explorations of inter-sentence interactions for textual entailment) for this entailment task. Given word-to-word interactions between the premise-hypothesis pair ($P$, $H$), DEISTE consists of: (i) a parameter-dynamic convolution that lets important words in $P$ and $H$ play a dominant role in the learned representations; and (ii) a position-aware attentive convolution that encodes both the representations and the positions of the aligned word pairs. Experiments show that DEISTE achieves an $\approx$5% improvement over the prior state of the art, and that DEISTE pretrained on SciTail generalizes well to RTE-5.
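The abstract builds on word-to-word interactions between a premise and a hypothesis. As a minimal illustrative sketch (an assumption for clarity, not the paper's implementation), the interaction matrix can be taken as pairwise cosine similarities between word vectors, with a row-wise argmax giving, for each hypothesis word, its best-aligned premise word -- the kind of alignment the attentive step operates on:

```python
# Sketch of a word-to-word interaction matrix between a premise P and a
# hypothesis H. All names here are illustrative, not from the paper.
from math import sqrt

def cosine(u, v):
    """Cosine similarity of two word vectors (0.0 for zero vectors)."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = sqrt(sum(a * a for a in u))
    nv = sqrt(sum(b * b for b in v))
    return dot / (nu * nv) if nu and nv else 0.0

def interaction_matrix(premise_vecs, hyp_vecs):
    """Rows index hypothesis words, columns index premise words."""
    return [[cosine(h, p) for p in premise_vecs] for h in hyp_vecs]

def best_alignments(matrix):
    """For each hypothesis word, the index of its most similar premise word."""
    return [max(range(len(row)), key=row.__getitem__) for row in matrix]

# Toy example with 2-dimensional word vectors.
P = [[1.0, 0.0], [0.0, 1.0]]
H = [[0.9, 0.1], [0.1, 0.9]]
m = interaction_matrix(P, H)
print(best_alignments(m))  # each hypothesis word aligned to a premise word
```

In DEISTE, both the representations and the positions of such aligned pairs feed the position-aware attentive convolution.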
Year
2018
DOI
10.18653/v1/p18-2086
Venue
Meeting of the Association for Computational Linguistics
DocType
Conference
Volume
abs/1804.08813
Citations
4
PageRank
0.38
References
16
Authors
3
Name | Order | Citations | PageRank
Wenpeng Yin | 1 | 387 | 23.87
Dan Roth | 2 | 7735 | 695.19
Hinrich Schütze | 3 | 2113 | 362.21