Title
LaPraDoR: Unsupervised Pretrained Dense Retriever for Zero-Shot Text Retrieval
Abstract
In this paper, we propose LaPraDoR, a pretrained dual-tower dense retriever that does not require any supervised data for training. Specifically, we first present Iterative Contrastive Learning (ICoL), which iteratively trains the query and document encoders with a cache mechanism. ICoL not only enlarges the number of negative instances but also keeps representations of cached examples in the same hidden space. We then propose Lexicon-Enhanced Dense Retrieval (LEDR) as a simple yet effective way to enhance dense retrieval with lexical matching. We evaluate LaPraDoR on the recently proposed BEIR benchmark, which includes 18 datasets covering 9 zero-shot text retrieval tasks. Experimental results show that LaPraDoR achieves state-of-the-art performance compared with supervised dense retrieval models, and further analysis reveals the effectiveness of our training strategy and objectives. Compared to re-ranking, our lexicon-enhanced approach can be run in milliseconds (22.5x faster) while achieving superior performance.
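The abstract does not spell out how the lexical and dense scores are combined in Lexicon-Enhanced Dense Retrieval; the sketch below is a minimal illustration only, assuming a BM25-style lexical score that rescales the cosine similarity produced by a dual-tower encoder. All function names, parameters, and toy data here are hypothetical, not the paper's implementation.

import math
from collections import Counter

def bm25_score(query_tokens, doc_tokens, doc_freqs, n_docs, avg_len, k1=0.9, b=0.4):
    # Toy BM25-style lexical-matching score for one query-document pair.
    tf = Counter(doc_tokens)
    score = 0.0
    for term in query_tokens:
        if term not in tf:
            continue
        df = doc_freqs.get(term, 0)
        idf = math.log(1 + (n_docs - df + 0.5) / (df + 0.5))
        denom = tf[term] + k1 * (1 - b + b * len(doc_tokens) / avg_len)
        score += idf * tf[term] * (k1 + 1) / denom
    return score

def dense_score(q_vec, d_vec):
    # Cosine similarity between query and document embeddings.
    dot = sum(a * b for a, b in zip(q_vec, d_vec))
    norm = math.sqrt(sum(a * a for a in q_vec)) * math.sqrt(sum(b * b for b in d_vec))
    return dot / norm if norm else 0.0

def ledr_score(lexical, dense):
    # Assumed combination rule for illustration: the lexical score rescales
    # the dense similarity, so both signals must agree for a high final score.
    return lexical * dense

# Toy corpus and query; the vectors stand in for dual-tower encoder outputs.
docs = [["dense", "retrieval", "with", "contrastive", "learning"],
        ["lexical", "matching", "with", "bm25"]]
query = ["dense", "retrieval"]
doc_freqs = Counter(t for d in docs for t in set(d))
avg_len = sum(len(d) for d in docs) / len(docs)
q_vec = [0.1, 0.9]
d_vecs = [[0.2, 0.8], [0.7, 0.3]]

for tokens, d_vec in zip(docs, d_vecs):
    lex = bm25_score(query, tokens, doc_freqs, len(docs), avg_len)
    print(tokens, round(ledr_score(lex, dense_score(q_vec, d_vec)), 4))

In the full system described by the abstract, the dense component would be the unsupervised pretrained query and document towers trained with ICoL, and the lexical component a standard BM25 index over the corpus.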
Year
2022
DOI
10.18653/v1/2022.findings-acl.281
Venue
Findings of the Association for Computational Linguistics (ACL 2022)
DocType
Conference
Volume
Findings of the Association for Computational Linguistics: ACL 2022
Citations
0
PageRank
0.34
References
0
Authors
4
Name                   Order  Citations  PageRank
Canwen Xu              1      5          3.80
Daya Guo               2      6          4.81
Nan Duan               3      2134       5.87
Julian John McAuley    4      28561      15.30