Title
Attentive Autoencoders for Multifaceted Preference Learning in One-class Collaborative Filtering
Abstract
Most existing One-Class Collaborative Filtering (OC-CF) algorithms estimate a user's preference as a single latent vector by encoding their historical interactions. However, users often exhibit diverse interests, which significantly increases the learning difficulty. To capture multifaceted user preferences, existing recommender systems either increase the encoding complexity or extend the latent representation dimension. Unfortunately, these changes inevitably increase training difficulty and exacerbate scalability issues. In this paper, we propose a novel and efficient CF framework called Attentive Multimodal AutoRec (AMA) that explicitly tracks multiple facets of user preferences. Specifically, we extend the autoencoder-based recommender AutoRec to learn user preferences with multimodal latent representations, where each mode captures one facet of a user's preferences. By leveraging the attention mechanism, each observed interaction can contribute differently to each preference facet. Through extensive experiments on three real-world datasets, we show that AMA is competitive with state-of-the-art models under the OC-CF setting. We also demonstrate how the proposed model improves interpretability by providing explanations through the attention mechanism.
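The abstract describes the core mechanism: an AutoRec-style autoencoder whose latent representation is split into several facet vectors, with attention deciding how strongly each observed item contributes to each facet. The PyTorch snippet below is a minimal illustrative sketch of that idea under stated assumptions, not the authors' implementation; the class name, the facet-query parameterization, and the hyperparameters (embed_dim, num_facets) are hypothetical choices made here for clarity.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class AttentiveMultimodalAutoRec(nn.Module):
    """Illustrative sketch of an attentive multimodal autoencoder for OC-CF.

    The encoder aggregates a user's observed items into K facet vectors via
    attention; the decoder reconstructs the full interaction vector from the
    concatenated facets. This is an assumption-laden sketch, not the paper's code.
    """

    def __init__(self, num_items, embed_dim=64, num_facets=4):
        super().__init__()
        self.item_embed = nn.Embedding(num_items, embed_dim)                   # item embeddings used by the encoder
        self.facet_queries = nn.Parameter(torch.randn(num_facets, embed_dim))  # one attention query per preference facet
        self.decoder = nn.Linear(embed_dim * num_facets, num_items)            # maps concatenated facets back to item scores

    def forward(self, interactions):
        # interactions: (batch, num_items) binary implicit-feedback vectors
        item_vecs = self.item_embed.weight                    # (num_items, d)
        scores = self.facet_queries @ item_vecs.t()           # (K, num_items) facet-item affinities
        # keep only each user's observed items by pushing unobserved ones to -1e9 before softmax
        masked = scores.unsqueeze(0) + (interactions.unsqueeze(1) - 1.0) * 1e9
        attn = F.softmax(masked, dim=-1)                      # (batch, K, num_items) per-facet attention over history
        facets = attn @ item_vecs                              # (batch, K, d) one latent vector per facet
        logits = self.decoder(facets.flatten(start_dim=1))     # predicted preference scores for all items
        return logits, attn

# Toy usage: two users, ten items, reconstruction loss on implicit feedback.
model = AttentiveMultimodalAutoRec(num_items=10, embed_dim=8, num_facets=2)
x = torch.zeros(2, 10)
x[0, [1, 3, 7]] = 1.0
x[1, [0, 2]] = 1.0
logits, attn = model(x)
loss = F.binary_cross_entropy_with_logits(logits, x)
loss.backward()
```

In this sketch the returned attention weights also indicate which observed items drive each facet, which is the kind of signal the abstract refers to when it mentions attention-based explanations.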
Year
2020
DOI
10.1109/ICDMW51313.2020.00032
Venue
2020 International Conference on Data Mining Workshops (ICDMW)
Keywords
One-class Collaborative Filtering, Attention Model, Multifaceted Preference Recommendation
DocType
Conference
ISSN
2375-9232
ISBN
978-1-7281-9013-6
Citations
0
PageRank
0.34
References
0
Authors
4
Name            Order   Citations   PageRank
Zheda Mai       1       0           1.35
Ga Wu           2       20          6.42
Kai Luo         3       3           1.40
Scott Sanner    4       19          6.35