Title
Coarse-to-Fine Sparse Sequential Recommendation
Abstract
Sequential recommendation aims to model dynamic user behavior from historical interactions. Self-attentive methods have proven effective at capturing short-term dynamics and long-term preferences. Despite their success, these approaches still struggle to model sparse data, on which it is difficult to learn high-quality item representations. We propose to model user dynamics from shopping intents and interacted items simultaneously. The learned intents are coarse-grained and serve as prior knowledge for item recommendation. To this end, we present a coarse-to-fine self-attention framework, namely CaFe, which explicitly learns coarse-grained and fine-grained sequential dynamics. Specifically, CaFe first learns intents from coarse-grained sequences, which are dense and hence provide high-quality user intent representations. Then, CaFe fuses the intent representations into item encoder outputs to obtain improved item representations. Finally, we infer recommended items based on the representations of items and their corresponding intents. Experiments on sparse datasets show that CaFe outperforms state-of-the-art self-attentive recommenders by 44.03% NDCG@5 on average.
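To make the coarse-to-fine idea concrete, below is a minimal Python (PyTorch) sketch of the two-encoder design described in the abstract: one self-attention encoder over a coarse-grained intent sequence and one over the fine-grained item sequence, with intent states fused into item states before scoring the next item. This is an illustration under simplifying assumptions, not the authors' implementation: intents are assumed to be given as category IDs aligned with the item sequence, fusion is plain element-wise addition, causal masking is omitted, and all class and parameter names (e.g., CoarseToFineSketch) are hypothetical.

```python
import torch
import torch.nn as nn


class CoarseToFineSketch(nn.Module):
    """Hypothetical two-encoder sketch: coarse-grained intents + fine-grained items."""

    def __init__(self, num_items, num_intents, dim=64, n_heads=2, n_layers=2, max_len=50):
        super().__init__()
        self.item_emb = nn.Embedding(num_items, dim, padding_idx=0)
        self.intent_emb = nn.Embedding(num_intents, dim, padding_idx=0)
        self.pos_emb = nn.Embedding(max_len, dim)
        # Coarse-grained encoder over the (dense) intent sequence.
        self.intent_encoder = nn.TransformerEncoder(
            nn.TransformerEncoderLayer(d_model=dim, nhead=n_heads, batch_first=True),
            num_layers=n_layers,
        )
        # Fine-grained encoder over the (sparse) item sequence.
        self.item_encoder = nn.TransformerEncoder(
            nn.TransformerEncoderLayer(d_model=dim, nhead=n_heads, batch_first=True),
            num_layers=n_layers,
        )

    def forward(self, item_seq, intent_seq):
        # item_seq, intent_seq: (batch, seq_len) integer IDs, position-aligned.
        pos = torch.arange(item_seq.size(1), device=item_seq.device).unsqueeze(0)
        intent_h = self.intent_encoder(self.intent_emb(intent_seq) + self.pos_emb(pos))
        item_h = self.item_encoder(self.item_emb(item_seq) + self.pos_emb(pos))
        # Fuse coarse-grained intent states into item states (simple addition here).
        fused = item_h + intent_h
        # Score all candidate items from the last position's fused state.
        return fused[:, -1, :] @ self.item_emb.weight.T  # (batch, num_items)


# Toy usage: four users, length-5 histories.
model = CoarseToFineSketch(num_items=1000, num_intents=20)
items = torch.randint(1, 1000, (4, 5))
intents = torch.randint(1, 20, (4, 5))
print(model(items, intents).shape)  # torch.Size([4, 1000])
```

The key design point the sketch mirrors is that the intent encoder operates on a denser signal than the item encoder, so its outputs can act as a prior that stabilizes the sparse item representations.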
Year
2022
DOI
10.1145/3477495.3531732
Venue
SIGIR '22: Proceedings of the 45th International ACM SIGIR Conference on Research and Development in Information Retrieval
Keywords
Sequential Recommendation, Transformer, Sparse Data, Coarse-to-Fine Framework
DocType
Conference
Citations
0
PageRank
0.34
References
10
Authors
8
Name                  Order  Citations  PageRank
Jiacheng Li           1      0          0.68
Tong Zhao             2      220        14.25
Jin Li                3      4          1.41
Jim Chan              4      3          1.06
Christos Faloutsos    5      27972      4490.38
George Karypis        6      15691      1171.82
Soo-Min Pantel        7      0          0.34
Julian John McAuley   8      2856       115.30