Title
Cross-Batch Negative Sampling for Training Two-Tower Recommenders
Abstract
The two-tower architecture has been widely applied for learning item and user representations, which is important for large-scale recommender systems. Many two-tower models are trained using various in-batch negative sampling strategies, whose effectiveness inherently depends on the size of mini-batches. However, training two-tower models with a large batch size is inefficient, as it demands a large amount of memory for item and user features and consumes considerable time for feature encoding. Interestingly, we find that neural encoders output relatively stable features for the same input after warming up in the training process. Based on this observation, we propose a simple yet effective sampling strategy called Cross-Batch Negative Sampling (CBNS), which takes advantage of the encoded item embeddings from recent mini-batches to boost the model training. Both theoretical analysis and empirical evaluations demonstrate the effectiveness and the efficiency of CBNS.
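The abstract describes CBNS as caching item embeddings from recent mini-batches and reusing them as additional negatives. The following is a minimal PyTorch sketch of that idea, assuming a FIFO memory bank and an in-batch softmax loss; the class name CrossBatchNegativeSampler, the queue size, and the temperature are illustrative assumptions, not details taken from the paper.

```python
import torch
import torch.nn.functional as F

class CrossBatchNegativeSampler:
    """FIFO memory bank caching item embeddings from recent
    mini-batches to serve as extra negatives (a CBNS-style sketch)."""

    def __init__(self, embed_dim: int, queue_size: int = 4096):
        self.queue = torch.zeros(0, embed_dim)  # empty bank at start
        self.queue_size = queue_size

    @torch.no_grad()
    def enqueue(self, item_emb: torch.Tensor) -> None:
        # Detach so cached embeddings carry no gradients.
        bank = torch.cat([self.queue, item_emb.detach()], dim=0)
        self.queue = bank[-self.queue_size:]  # keep only the newest entries

def cbns_loss(user_emb, item_emb, sampler, temperature=0.07):
    """Softmax loss over in-batch plus cross-batch negatives."""
    user_emb = F.normalize(user_emb, dim=-1)
    item_emb = F.normalize(item_emb, dim=-1)
    # Candidate pool: current batch items plus the memory bank.
    candidates = torch.cat([item_emb, sampler.queue], dim=0)
    logits = user_emb @ candidates.t() / temperature  # (B, B + Q)
    labels = torch.arange(user_emb.size(0))           # diagonal = positives
    loss = F.cross_entropy(logits, labels)
    sampler.enqueue(item_emb)                         # refresh the bank
    return loss

# Toy usage: random stand-ins for encoder outputs on one training step.
if __name__ == "__main__":
    sampler = CrossBatchNegativeSampler(embed_dim=64, queue_size=1024)
    users, items = torch.randn(32, 64), torch.randn(32, 64)
    print(cbns_loss(users, items, sampler).item())
```

Per the abstract, the cached embeddings are treated as approximately valid negatives because encoder outputs change slowly once the model has warmed up; a practical variant would therefore start enqueueing only after an initial warm-up phase.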
Year
2021
DOI
10.1145/3404835.3463032
Venue
SIGIR
Keywords
recommender systems, information retrieval, neural networks
DocType
Conference
Citations
0
PageRank
0.34
References
0
Authors
3
Name          Order  Citations  PageRank
Jinpeng Wang  1      0          3.38
Jieming Zhu   2      44         5.27
Xiuqiang He   3      3123       9.21