Title
Pre-train a Discriminative Text Encoder for Dense Retrieval via Contrastive Span Prediction
Abstract
Dense retrieval has shown promising results in many information retrieval (IR) tasks, and its success rests on learning high-quality text representations for effective search. Some recent studies have shown that autoencoder-based language models can boost dense retrieval performance by using a weak decoder. However, we argue that 1) decoding all of the input text is not discriminative, and 2) even a weak decoder can still exert a bypass effect on the encoder. Therefore, in this work, we introduce a novel contrastive span prediction task to pre-train the encoder alone while still retaining the bottleneck ability of the autoencoder. In this way, we can 1) learn discriminative text representations efficiently via group-wise contrastive learning over spans, and 2) avoid the bypass effect of the decoder entirely. Comprehensive experiments on publicly available retrieval benchmark datasets show that our approach significantly outperforms existing pre-training methods for dense retrieval.
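To make the group-wise contrastive idea in the abstract concrete, below is a minimal PyTorch sketch of a loss that pulls each text's global representation toward representations of spans sampled from that same text and pushes it away from spans of other texts in the batch. All names here (groupwise_contrastive_loss, temperature, the choice of K spans per text) are illustrative assumptions, not the paper's exact formulation.

import torch
import torch.nn.functional as F

def groupwise_contrastive_loss(text_reps, span_reps, temperature=0.05):
    # text_reps: (B, d)    global representation of each text (e.g. the [CLS] vector)
    # span_reps: (B, K, d) representations of K spans sampled from each text
    B, K, d = span_reps.shape
    text_reps = F.normalize(text_reps, dim=-1)
    span_reps = F.normalize(span_reps.reshape(B * K, d), dim=-1)
    # Cosine similarity of every text to every span in the batch: (B, B*K).
    sim = text_reps @ span_reps.T / temperature
    # Span j belongs to text j // K, so mark each text's K positive spans.
    owner = torch.arange(B * K, device=sim.device) // K  # (B*K,)
    pos_mask = owner.unsqueeze(0) == torch.arange(B, device=sim.device).unsqueeze(1)  # (B, B*K)
    # Log-softmax over all spans in the batch, then average the
    # log-likelihood of each text's own K spans.
    log_prob = sim - torch.logsumexp(sim, dim=1, keepdim=True)
    loss = -(log_prob * pos_mask).sum(dim=1) / K
    return loss.mean()

# Toy usage with random vectors standing in for encoder outputs.
B, K, d = 4, 3, 768
loss = groupwise_contrastive_loss(torch.randn(B, d), torch.randn(B, K, d))
print(loss.item())

Because only the encoder produces both the text and span representations, no decoder is involved, which is the point the abstract makes about avoiding the bypass effect.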
Year
2022
DOI
10.1145/3477495.3531772
Venue
SIGIR '22: Proceedings of the 45th International ACM SIGIR Conference on Research and Development in Information Retrieval
Keywords
Dense Retrieval, Pre-training for IR, Discriminative Representation
DocType
Conference
Citations
0
PageRank
0.34
References
11
Authors
5
Name           Order   Citations   PageRank
Xinyu Ma       1       8           1.53
Jiafeng Guo    2       1737        102.17
Ruqing Zhang   3       15          10.40
Yixing Fan     4       202         19.39
Xueqi Cheng    5       3148        247.04