Title
Topology Distillation for Recommender System
Abstract
Recommender Systems (RS) have employed knowledge distillation, a model compression technique that trains a compact student model with the knowledge transferred from a pre-trained large teacher model. Recent work has shown that transferring knowledge from the teacher's intermediate layer significantly improves the recommendation quality of the student. However, these methods transfer the knowledge of individual representations point-wise, and thus have a limitation in that the primary information in RS lies in the relations within the representation space. This paper proposes a new topology distillation approach that guides the student by transferring the topological structure built upon the relations in the teacher space. We first observe that simply making the student learn the whole topological structure is not always effective and can even degrade the student's performance. We demonstrate that, because the capacity of the student is highly limited compared to that of the teacher, learning the whole topological structure is daunting for the student. To address this issue, we propose a novel method named Hierarchical Topology Distillation (HTD), which distills the topology hierarchically to cope with the large capacity gap. Our extensive experiments on real-world datasets show that the proposed method significantly outperforms state-of-the-art competitors. We also provide in-depth analyses to ascertain the benefit of distilling the topology for RS.
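The sketch below illustrates the general idea of relation-based (topology) distillation described in the abstract: pairwise relation matrices are built in the teacher and student representation spaces and the student is trained to match them, either over the whole topology or group-wise. This is a minimal PyTorch sketch under stated assumptions; the cosine-similarity relation, the MSE matching loss, and the random-centroid grouping are illustrative choices, not the paper's exact HTD formulation.

import torch
import torch.nn.functional as F

def topology_matrix(emb):
    # Pairwise cosine similarities among all representations in a batch.
    normed = F.normalize(emb, dim=1)
    return normed @ normed.t()

def full_topology_loss(teacher_emb, student_emb):
    # Naive variant: match the student's full relation matrix to the teacher's.
    with torch.no_grad():
        t_topo = topology_matrix(teacher_emb)
    return F.mse_loss(topology_matrix(student_emb), t_topo)

def hierarchical_topology_loss(teacher_emb, student_emb, num_groups=4):
    # Illustrative hierarchical variant (assumption, not the authors' exact method):
    # group entities in the teacher space, then distill (i) relations among group
    # summaries and (ii) relations within each group, instead of the whole topology.
    with torch.no_grad():
        # Toy grouping: assign each entity to its nearest randomly chosen anchor point.
        anchors = teacher_emb[torch.randperm(teacher_emb.size(0))[:num_groups]]
        assign = torch.cdist(teacher_emb, anchors).argmin(dim=1)

    # (i) Group-level topology over group mean representations.
    t_means = torch.stack([teacher_emb[assign == g].mean(0) for g in range(num_groups)])
    s_means = torch.stack([student_emb[assign == g].mean(0) for g in range(num_groups)])
    loss = F.mse_loss(topology_matrix(s_means), topology_matrix(t_means).detach())

    # (ii) Entity-level topology within each group.
    for g in range(num_groups):
        idx = (assign == g).nonzero(as_tuple=True)[0]
        if idx.numel() < 2:
            continue
        loss = loss + F.mse_loss(
            topology_matrix(student_emb[idx]),
            topology_matrix(teacher_emb[idx]).detach(),
        )
    return loss

Because only relation matrices are compared, the teacher and student embeddings may have different dimensions, which is the practical appeal of relational distillation for compact student models.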
Year: 2021
DOI: 10.1145/3447548.3467319
Venue: Knowledge Discovery and Data Mining
Keywords: Recommender System, Knowledge Distillation, Relational Knowledge, Model Compression, Retrieval efficiency
DocType: Conference
Citations: 1
PageRank: 0.38
References: 0
Authors: 4
Name, Order, Citations, PageRank
SeongKu Kang, 1, 21, 4.55
Junyoung Hwang, 2, 16, 3.42
Wonbin Kweon, 3, 4, 2.48
Hwanjo Yu, 4, 1715, 114.02