Title
Item-side ranking regularized distillation for recommender system
Abstract
Recent recommender systems (RS) have adopted large and sophisticated model architectures to better capture complex user-item relationships, and accordingly, the size of the recommender is continuously increasing. To reduce the high inference cost of a large recommender, knowledge distillation (KD), a model compression technique that transfers knowledge from a large pre-trained model (teacher) to a small model (student), has been actively studied for RS. The state-of-the-art method is based on the ranking distillation approach, which trains the student to preserve the ranking orders among items predicted by the teacher. In this work, we propose a new regularization method designed to maximize the effect of ranking distillation in RS. Based on an in-depth analysis, we first point out an important limitation of the state-of-the-art ranking distillation method and the room for improvement it leaves. Then, we introduce item-side ranking regularization, which effectively prevents the student, whose capacity is limited, from overfitting, and enables the student to more accurately learn the teacher's prediction results. We validate the superiority of the proposed method through extensive experiments on real-world datasets.
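The record does not reproduce the paper's formulas, but the core idea described in the abstract, ranking distillation with an additional item-side ranking term, can be sketched. The snippet below is a minimal, hypothetical PyTorch sketch, not the authors' actual loss: it uses a Plackett-Luce (ListMLE-style) objective over the teacher's top-k ordering, applied once along the item axis per user (user side) and once along the user axis per item (item side). The function name, the dense score-matrix setup, and the 0.5 trade-off weight are all illustrative assumptions.

```python
import torch

def listwise_distill_loss(student_scores: torch.Tensor,
                          teacher_scores: torch.Tensor,
                          k: int = 10) -> torch.Tensor:
    """Plackett-Luce (ListMLE-style) loss that trains the student to
    reproduce the teacher's top-k ordering along dim 1.

    Both tensors have shape (batch, num_candidates): rows are users and
    columns are items for the user-side term; transpose both score
    matrices to obtain the item-side term.
    """
    k = min(k, teacher_scores.size(1))
    # Teacher's top-k candidates per row, best first.
    top_idx = teacher_scores.topk(k, dim=1).indices
    # Student scores re-ordered to match the teacher's ranking.
    s = student_scores.gather(1, top_idx)
    # Negative log-likelihood of the teacher's ordering under the
    # Plackett-Luce model: sum_i [logsumexp(s_i .. s_k) - s_i].
    suffix_lse = torch.flip(torch.logcumsumexp(torch.flip(s, [1]), dim=1), [1])
    return (suffix_lse - s).sum(dim=1).mean()

# Toy usage on random predictions (shapes are illustrative only):
num_users, num_items = 32, 100
teacher = torch.randn(num_users, num_items)
student = torch.randn(num_users, num_items, requires_grad=True)

user_side = listwise_distill_loss(student, teacher, k=10)          # rank items per user
item_side = listwise_distill_loss(student.t(), teacher.t(), k=10)  # rank users per item
loss = user_side + 0.5 * item_side  # 0.5 is an arbitrary trade-off weight
loss.backward()
```

Transposing the score matrices is what turns the same listwise loss into an item-side regularizer: each item's ranking over users becomes the list being distilled, which constrains the student from the item side as well.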
Year: 2021
DOI: 10.1016/j.ins.2021.08.060
Venue: Information Sciences
Keywords: Recommender system, Knowledge distillation, Learning to rank, Ranking regularization, Model compression, Retrieval efficiency, Item-side ranking, Dual-side ranking
DocType: Journal
Volume: 580
ISSN: 0020-0255
Citations: 0
PageRank: 0.34
References: 0
Authors: 4
Name             Order   Citations   PageRank
SeongKu Kang     1       21          4.55
Junyoung Hwang   2       16          3.42
Wonbin Kweon     3       4           2.48
Hwanjo Yu        4       1715        114.02