Title
Prediction of music pairwise preferences from facial expressions
Abstract
Users of a recommender system may be requested to express their preferences about items either with evaluations of single items (e.g., ratings) or with comparisons of item pairs. In this work we focus on the acquisition of pairwise preferences in the music domain. Explicitly asking the user to compare music, i.e., to state which of two listened tracks is preferred, requires some user effort. We have therefore developed a novel approach for automatically extracting these preferences from the analysis of the users' facial expressions while they listen to the compared tracks. We have trained a predictor that infers the user's pairwise preferences from features extracted from these data. We show that the predictor performs better than a commonly used baseline, which leverages the user's listening duration of the tracks to infer pairwise preferences. Furthermore, we show that the accuracy of the proposed method differs between users with different personalities, and we have therefore adapted the trained model accordingly. Our work shows that with a low-effort preference elicitation approach, which, however, requires access to information that may raise privacy issues (facial expressions), one can obtain good prediction accuracy of pairwise music preferences.
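Note: the abstract describes the approach only at a high level. The following is a minimal sketch, not the authors' implementation, of how a pairwise preference predictor based on per-track facial-expression features could be set up and compared against a listening-duration baseline. The data, feature dimensionality, pair representation, and the choice of a logistic-regression classifier are illustrative assumptions.

import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Hypothetical data: for each compared pair, a feature vector summarizing the
# user's facial expressions while listening to track A and to track B.
n_pairs, n_features = 200, 8
x_a = rng.normal(size=(n_pairs, n_features))   # facial features during track A
x_b = rng.normal(size=(n_pairs, n_features))   # facial features during track B
y = rng.integers(0, 2, size=n_pairs)           # 1 if track A is preferred, else 0

# Represent each pair by the difference of its per-track feature vectors, so the
# classifier learns which expression profile is associated with the preferred track.
X = x_a - x_b
clf = LogisticRegression(max_iter=1000)
print("facial-expression model accuracy:", cross_val_score(clf, X, y, cv=5).mean())

# Listening-duration baseline: predict the track the user listened to longer.
dur_a = rng.uniform(10, 60, size=n_pairs)      # seconds listened to track A
dur_b = rng.uniform(10, 60, size=n_pairs)      # seconds listened to track B
baseline_pred = (dur_a > dur_b).astype(int)
print("duration baseline accuracy:", (baseline_pred == y).mean())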
Year
2019
DOI
10.1145/3301275.3302266
Venue
Proceedings of the 24th International Conference on Intelligent User Interfaces
Keywords
emotions, facial expressions, implicit preference elicitation, pairwise scores
Field
Recommender system, Pairwise comparison, Preference elicitation, Computer science, Active listening, Human–computer interaction, Facial expression
DocType
Conference
ISBN
978-1-4503-6272-6
Citations
2
PageRank
0.36
References
19
Authors
6
Name                Order   Citations   PageRank
Marko Tkalcic       1       329         33.68
Nima Maleki         2       2           0.36
Matevz Pesek        3       7           2.21
Mehdi Elahi         4       408         29.41
Francesco Ricci     5       2597        165.86
Matija Marolt       6       144         19.41