Title
Deep Multifaceted Transformers for Multi-objective Ranking in Large-Scale E-commerce Recommender Systems
Abstract
Recommender systems play an essential role in e-commerce portals. Existing recommendation algorithms usually learn the ranking scores of items by optimizing a single task (e.g., click-through rate prediction) based on users' historical click sequences, but they generally pay little attention to simultaneously modeling users' multiple types of behaviors or jointly optimizing multiple objectives (e.g., both click-through rate and conversion rate), which are both vital for e-commerce sites. In this paper, we argue that it is crucial to formulate users' different interests based on multiple types of behaviors and to perform multi-task learning for significant improvements in multiple objectives simultaneously. We propose Deep Multifaceted Transformers (DMT), a novel framework that models users' multiple types of behavior sequences simultaneously with multiple Transformers. It utilizes Multi-gate Mixture-of-Experts to optimize multiple objectives, and it exploits unbiased learning to reduce the selection bias in the training data. Experiments on a real production dataset from JD demonstrate the effectiveness of DMT, which significantly outperforms state-of-the-art methods. DMT has been successfully deployed to serve the main traffic in the commercial recommender system of JD.com. To facilitate future research, we release the code and datasets at https://github.com/guyulongcs/CIKM2020_DMT.
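The abstract describes two architectural ideas: one Transformer per type of behavior sequence and a Multi-gate Mixture-of-Experts (MMoE) head that scores several objectives (e.g., CTR and CVR) jointly. The snippet below is a minimal, hypothetical PyTorch sketch of those two ideas only; it is not the authors' implementation (see the linked repository for that), and all module names, dimensions, and pooling choices here are illustrative assumptions.

import torch
import torch.nn as nn


class DMTSketch(nn.Module):
    """Toy sketch: per-behavior Transformer encoders + MMoE scoring head."""

    def __init__(self, d_model=64, n_behaviors=3, n_experts=4, n_tasks=2):
        super().__init__()
        # One Transformer encoder per behavior type (e.g., click, cart, order).
        self.encoders = nn.ModuleList([
            nn.TransformerEncoder(
                nn.TransformerEncoderLayer(d_model, nhead=4, batch_first=True),
                num_layers=1)
            for _ in range(n_behaviors)
        ])
        # Shared experts and one gate per task (Multi-gate Mixture-of-Experts).
        self.experts = nn.ModuleList([
            nn.Sequential(nn.Linear(n_behaviors * d_model, d_model), nn.ReLU())
            for _ in range(n_experts)
        ])
        self.gates = nn.ModuleList(
            [nn.Linear(n_behaviors * d_model, n_experts) for _ in range(n_tasks)])
        # One scoring tower per objective (e.g., CTR, CVR).
        self.towers = nn.ModuleList(
            [nn.Linear(d_model, 1) for _ in range(n_tasks)])

    def forward(self, behavior_seqs):
        # behavior_seqs: list of n_behaviors tensors, each (batch, seq_len, d_model).
        # Encode each behavior sequence and mean-pool it into one vector.
        pooled = [enc(seq).mean(dim=1)
                  for enc, seq in zip(self.encoders, behavior_seqs)]
        user_repr = torch.cat(pooled, dim=-1)  # (batch, n_behaviors * d_model)
        expert_out = torch.stack([e(user_repr) for e in self.experts], dim=1)
        scores = []
        for gate, tower in zip(self.gates, self.towers):
            # Each task mixes the shared experts with its own softmax gate.
            w = torch.softmax(gate(user_repr), dim=-1).unsqueeze(-1)
            task_repr = (w * expert_out).sum(dim=1)  # (batch, d_model)
            scores.append(torch.sigmoid(tower(task_repr)).squeeze(-1))
        return scores  # e.g., [ctr_score, cvr_score]


if __name__ == "__main__":
    model = DMTSketch()
    seqs = [torch.randn(8, 10, 64) for _ in range(3)]  # toy behavior sequences
    ctr, cvr = model(seqs)
    print(ctr.shape, cvr.shape)  # torch.Size([8]) torch.Size([8])

The unbiased-learning component mentioned in the abstract (reducing selection bias in logged training data) is omitted here, since the abstract does not specify its form.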
Year
2020
DOI
10.1145/3340531.3412697
Venue
CIKM '20: The 29th ACM International Conference on Information and Knowledge Management, Virtual Event, Ireland, October 2020
DocType
Conference
ISBN
978-1-4503-6859-9
Citations
2
PageRank
0.36
References
33
Authors
6
Name             Order  Citations  PageRank
Yulong Gu        1      27         2.55
Zhuoye Ding      2      150        11.23
Shuaiqiang Wang  3      254        22.72
Lixin Zou        4      14         1.70
Yiding Liu       5      7          3.19
Dawei Yin        6      866        61.99