Title: MTRec: Multi-Task Learning over BERT for News Recommendation
Abstract
Existing news recommendation methods usually learn news representations solely from news titles. To exploit other fields of news information, such as category and entities, some methods treat each field as an additional feature and combine the different feature vectors with attentive pooling. With the adoption of large pre-trained models like BERT in news recommendation, this way of incorporating multi-field information becomes problematic: the shallow feature encoding used to compress the category and entity information is not compatible with the deep BERT encoding. In this paper, we propose a multi-task learning framework that incorporates the multi-field information into BERT, improving its news encoding capability. In addition, we modify the gradients of the different tasks based on their gradient conflicts, which further boosts model performance. Extensive experiments on the MIND news recommendation benchmark show the effectiveness of our approach.
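The abstract mentions modifying task gradients based on their conflicts but does not spell out the rule. A common instantiation of this idea is a PCGrad-style projection: when an auxiliary task's gradient points against the main task's gradient (negative dot product), project out the conflicting component. The sketch below is an illustrative assumption about what such a step looks like, not the paper's exact algorithm; the function name and toy gradients are invented for the example.

```python
import numpy as np

def resolve_conflict(g_main, g_aux):
    """PCGrad-style sketch (assumed, not the paper's exact rule):
    if the auxiliary-task gradient conflicts with the main-task
    gradient (negative dot product), project the auxiliary gradient
    onto the normal plane of the main gradient; otherwise keep it."""
    dot = np.dot(g_aux, g_main)
    if dot < 0:
        # subtract the component of g_aux that opposes g_main
        g_aux = g_aux - dot / (np.dot(g_main, g_main) + 1e-12) * g_main
    return g_aux

# Toy example with deliberately conflicting 2-D gradients.
g_main = np.array([1.0, 0.0])
g_aux = np.array([-1.0, 1.0])   # dot(g_aux, g_main) = -1 < 0: conflict
g_fixed = resolve_conflict(g_main, g_aux)
# After projection the adjusted auxiliary gradient no longer
# opposes the main-task gradient.
```

In a multi-task training loop, each task's gradient would be adjusted against the others before the (summed) update is applied to the shared BERT parameters.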
Year: 2022
DOI: 10.18653/v1/2022.findings-acl.209
Venue: Findings of the Association for Computational Linguistics (ACL 2022)
DocType: Conference
Volume: Findings of the Association for Computational Linguistics: ACL 2022
Citations: 0
PageRank: 0.34
References: 0
Authors: 6
Name          Order  Citations  PageRank
Qiwei Bi      1      0          0.34
Jian Li       2      3          2.74
Lifeng Shang  3      485        30.96
Xin Jiang     4      150        32.43
Qun Liu       5      2149       203.11
Hanfang Yang  6      0          0.68