| Abstract |
|---|
| Deep pre-trained language models (e.g., BERT) have led to remarkable progress in many Natural Language Processing tasks. Their superior capacity for perceiving textual data has also been witnessed in semantic matching tasks (e.g., question answering, web search). In particular, for matching a query against a candidate text, current state-of-the-art methods usually rely on the semantic representations produced by BERT and compute relevance scores with various interaction (i.e., matching) methods. However, they may 1) miss fine-grained phrase-level interaction between the input query and candidate context, or 2) lack a thoughtful consideration of both effectiveness and efficiency. Motivated by this, we propose Interactor, a BERT-based semantic matching model with a flexible contextualized interaction paradigm. It is capable of capturing fine-grained phrase-level information in the interaction, and is thus more effective for semantic matching tasks. Moreover, we further equip Interactor with a novel partial attention scheme, which significantly reduces the computational cost while maintaining high effectiveness. We conduct comprehensive experimental evaluations on three datasets. The results show that Interactor achieves superior effectiveness and efficiency for semantic matching. |
| Year | DOI | Venue |
|---|---|---|
| 2022 | 10.1145/3488560.3498442 | WSDM |

| Keywords | DocType | Citations |
|---|---|---|
| Text Retrieval, Efficient Retrieval, BERT, Neural Network | Conference | 1 |

| PageRank | References | Authors |
|---|---|---|
| 0.39 | 0 | 7 |
| Name | Order | Citations | PageRank |
|---|---|---|---|
| Wenwen Ye | 1 | 1 | 1.40 |
| Yiding Liu | 2 | 7 | 3.19 |
| Lixin Zou | 3 | 14 | 1.70 |
| Hengyi Cai | 4 | 4 | 1.14 |
| Suqi Cheng | 5 | 5 | 2.19 |
| Shuaiqiang Wang | 6 | 254 | 22.72 |
| Dawei Yin | 7 | 866 | 61.99 |