Title
A Deep Neural Architecture For Sentence Semantic Matching
Abstract
Sentence semantic matching (SSM) is a fundamental research task in natural language processing. Most existing SSM methods take advantage of sentence representation learning to generate single- or multi-granularity semantic representations for sentence matching. However, sentence interaction and the loss function, two key factors for SSM, have not yet been fully explored. Accordingly, we propose a deep neural network architecture for the SSM task with a sentence interaction matching layer and an optimised loss function. Given two input sentences, our model first encodes them into embeddings with an ordinary long short-term memory (LSTM) encoder. The encoded embeddings are then passed through an attention layer to identify the key words in each sentence. Next, sentence interactions are captured by a matching layer, which outputs a matching vector. Finally, a fully connected multi-layer perceptron computes a similarity score from the matching vector. The model also distinguishes ambiguous training instances with an improved loss function. We systematically evaluate our model on BQ corpus, a public Chinese semantic matching corpus. The results demonstrate that our model outperforms state-of-the-art methods such as BiMPM and DIIN.
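The pipeline described in the abstract (attention over LSTM hidden states, a matching layer, then an MLP scorer) can be sketched as follows. This is a minimal illustrative NumPy sketch, not the authors' implementation: the LSTM encoder is abstracted as precomputed hidden-state matrices `H1`/`H2`, the shared attention query `w` and all dimensions are hypothetical, and the matching features (concatenation, absolute difference, elementwise product) are a common heuristic choice that the paper may realise differently.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def attend(H, w):
    # H: (seq_len, d) LSTM hidden states; w: (d,) attention query (assumed).
    scores = H @ w                 # one score per word
    alpha = softmax(scores)        # attention weights highlighting key words
    return alpha @ H               # (d,) attended sentence vector

def match(u, v):
    # Matching vector from heuristic interaction features (assumption):
    # concatenation, absolute difference, elementwise product.
    return np.concatenate([u, v, np.abs(u - v), u * v])

def mlp_score(m, W1, b1, W2, b2):
    # Fully connected MLP mapping the matching vector to a similarity score.
    h = np.tanh(m @ W1 + b1)
    return 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))   # sigmoid -> (0, 1)

# Toy shapes: sentence length L, hidden size d (both hypothetical).
d, L = 8, 5
H1 = rng.normal(size=(L, d))       # stand-in for LSTM encoding of sentence 1
H2 = rng.normal(size=(L, d))       # stand-in for LSTM encoding of sentence 2
w = rng.normal(size=d)
W1, b1 = rng.normal(size=(4 * d, 16)), np.zeros(16)
W2, b2 = rng.normal(size=16), 0.0

u, v = attend(H1, w), attend(H2, w)
m = match(u, v)                    # matching vector, dimension 4 * d
score = mlp_score(m, W1, b1, W2, b2)
print(float(score))
```

In a trained model, `w`, `W1`, `b1`, `W2`, and `b2` would be learned jointly with the LSTM encoder, and the score would be fed to the (improved) loss function during training.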
Year
2020
DOI
10.1504/IJCSE.2020.106870
Venue
INTERNATIONAL JOURNAL OF COMPUTATIONAL SCIENCE AND ENGINEERING
Keywords
sentence matching, representation learning, sentence interaction, loss function, deep neural model, long short-term memory, LSTM
DocType
Journal
Volume
21
Issue
4
ISSN
1742-7185
Citations
0
PageRank
0.34
References
0
Authors
5
Name	Order	Citations	PageRank
Xu Zhang	1	7	2.86
Wenpeng Lu	2	0	0.34
Fangfang Li	3	6	1.49
Ruoyu Zhang	4	1	0.69
Jinyong Cheng	5	0	0.34