Title: Fast Semantic Matching via Flexible Contextualized Interaction
Abstract
Deep pre-trained language models (e.g., BERT) have led to remarkable progress in many Natural Language Processing tasks. Their superior capacity for understanding textual data has also been demonstrated in semantic matching tasks (e.g., question answering, web search). In particular, for matching a query with a candidate text, current state-of-the-art models usually rely on the semantic representations produced by BERT and compute relevance scores with various interaction (i.e., matching) methods. However, they may 1) miss fine-grained phrase-level interactions between the input query and the candidate context, or 2) lack a careful consideration of both effectiveness and efficiency. Motivated by this, we propose Interactor, a BERT-based semantic matching model with a flexible contextualized interaction paradigm. It captures fine-grained phrase-level information during interaction and is thus more effective for semantic matching tasks. Moreover, we further equip Interactor with a novel partial attention scheme, which significantly reduces the computational cost while maintaining high effectiveness. We conduct comprehensive experimental evaluations on three datasets. The results show that Interactor achieves superior effectiveness and efficiency for semantic matching.
Year: 2022
DOI: 10.1145/3488560.3498442
Venue: WSDM
Keywords: Text Retrieval, Efficient Retrieval, BERT, Neural Network
DocType: Conference
Citations: 1
PageRank: 0.39
References: 0
Authors: 7
Name            | Order | Citations | PageRank
Wenwen Ye       | 1     | 1         | 1.40
Yiding Liu      | 2     | 7         | 3.19
Lixin Zou       | 3     | 14        | 1.70
Hengyi Cai      | 4     | 4         | 1.14
Suqi Cheng      | 5     | 5         | 2.19
Shuaiqiang Wang | 6     | 2542      | 2.72
Dawei Yin       | 7     | 8666      | 1.99