Title
DML: Dynamic Multi-Granularity Learning for BERT-Based Document Reranking
Abstract
Recently, pre-trained language models have been successfully applied to text retrieval and ranking. However, in real-world scenarios, users' click behavior is usually affected by selection, position, or exposure bias, which may lead to insufficient positive annotations and introduce additional noise. Moreover, for different candidate documents of the same query, previous optimization objectives usually use a single granularity and static loss weights, making ranking models more susceptible to the bias issues mentioned above. In this paper, we therefore focus on BERT-based document reranking and propose Dynamic Multi-Granularity Learning (DML). By introducing a Gaussian distribution into traditional loss functions, the weights of different documents change dynamically with the prediction probability, reducing the impact of unlabeled positive documents. In addition, both document granularity and instance granularity are considered to balance the relative relations and absolute scores of candidate documents. Extensive experiments show that DML significantly outperforms previous state-of-the-art models on the MS MARCO document ranking dataset.
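The abstract's core idea is a loss whose per-document weight follows a Gaussian over the model's predicted probability, so confidently scored but unlabeled documents are down-weighted. The paper's exact formulation is not reproduced in this record; the sketch below is a minimal illustration under assumed names and parameters (`gaussian_weight`, `mu`, `sigma` are hypothetical, not from the paper).

```python
import math

def gaussian_weight(p, mu=0.5, sigma=0.2):
    """Hypothetical dynamic weight: a Gaussian over the predicted
    probability p. Documents scored near mu get the largest weight;
    confidently scored documents (p near 0 or 1), which may be
    unlabeled positives, are down-weighted."""
    return math.exp(-((p - mu) ** 2) / (2 * sigma ** 2))

def dynamically_weighted_bce(p, label, mu=0.5, sigma=0.2):
    """Binary cross-entropy scaled by the dynamic Gaussian weight,
    one illustrative way to combine the two terms."""
    eps = 1e-12  # guard against log(0)
    bce = -(label * math.log(p + eps) + (1 - label) * math.log(1 - p + eps))
    return gaussian_weight(p, mu, sigma) * bce
```

With these assumed defaults, a document predicted at p = 0.95 contributes far less to the loss than one predicted at p = 0.5, which is the intended protection against noisy negatives.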
Year: 2021
DOI: 10.1145/3459637.3482090
Venue: Conference on Information and Knowledge Management
DocType: Conference
Citations: 0
PageRank: 0.34
References: 0
Authors: 2
Name          Order  Citations  PageRank
Xuanyu Zhang  1      0          0.68
Qing Yang     2      0          1.35