Title
Understanding the Behaviors of BERT in Ranking
Abstract
This paper studies the performance and behaviors of BERT in ranking tasks. We explore several different ways to leverage the pre-trained BERT and fine-tune it on two ranking tasks: MS MARCO passage reranking and TREC Web Track ad hoc document ranking. Experimental results on MS MARCO demonstrate the strong effectiveness of BERT in question-answering focused passage ranking tasks, as well as the fact that BERT is a strong interaction-based seq2seq matching model. Experimental results on TREC show the gaps between the BERT pre-trained on surrounding contexts and the needs of ad hoc document ranking. Analyses illustrate how BERT allocates its attention between query-document tokens in its Transformer layers, how it prefers semantic matches between paraphrase tokens, and how that differs from the soft match patterns learned by a click-trained neural ranker.
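For illustration only, below is a minimal sketch of the interaction-based (cross-encoder) use of BERT that the abstract refers to: the query and passage are encoded jointly and the [CLS] representation is projected to a relevance score. It assumes the HuggingFace transformers library and a generic bert-base-uncased checkpoint; the scoring head and the absence of fine-tuning here are simplifications, not the authors' actual configuration.

```python
# Hedged sketch of a BERT cross-encoder reranker (not the paper's exact setup).
# Assumes: HuggingFace `transformers`, `torch`, and a generic bert-base-uncased checkpoint.
import torch
from transformers import BertTokenizer, BertModel

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
bert = BertModel.from_pretrained("bert-base-uncased")
# Hypothetical ranking head; in practice it is fine-tuned jointly with BERT on MS MARCO-style labels.
score_head = torch.nn.Linear(bert.config.hidden_size, 1)

def rank(query, passages):
    """Score each passage against the query with a joint (interaction-based) encoding."""
    # Encode each (query, passage) pair together so cross-attention between their tokens is possible.
    inputs = tokenizer([query] * len(passages), passages,
                       padding=True, truncation=True, return_tensors="pt")
    with torch.no_grad():
        cls = bert(**inputs).last_hidden_state[:, 0]  # [CLS] vector per query-passage pair
    scores = score_head(cls).squeeze(-1)              # one relevance score per passage
    order = scores.argsort(descending=True)
    return [(passages[i], scores[i].item()) for i in order]

# Usage example (scores are meaningless until the head and BERT are fine-tuned on ranking data):
print(rank("what is bert", ["BERT is a pre-trained language model.",
                            "The capital of France is Paris."]))
```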
Year
2019
Venue
arXiv: Information Retrieval
DocType
Journal
Volume
abs/1904.07531
Citations
3
PageRank
0.38
References
0
Authors
4
Name            | Order | Citations | PageRank
Yifan Qiao      | 1     | 6         | 3.50
Chen-Yan Xiong  | 2     | 405       | 30.82
Zheng-Hao Liu   | 3     | 9         | 3.20
Zhiyuan Liu     | 4     | 2037      | 123.68