Title
Quantifying the Relationships between Everyday Objects and Emotional States through Deep Learning Based Image Analysis Using Smartphones
Abstract
There has been an increasing interest in the problem of inferring emotional states of individuals using sensor and user-generated information as diverse as GPS traces, social media data and smartphone interaction patterns. One aspect that has received little attention is the use of visual context information extracted from the surroundings of individuals and how they relate to them. In this paper, we present an observational study of the relationships between the emotional states of individuals and the objects present in their visual environment, automatically extracted from smartphone images using deep learning techniques. We developed MyMood, a smartphone application that allows users to periodically log their emotional state together with pictures from their everyday lives, while passively gathering sensor measurements. We conducted an in-the-wild study with 22 participants and collected 3,305 mood reports with photos. Our findings show context-dependent associations between the objects surrounding individuals and self-reported emotional state intensities. The potential applications of this work are many, from the design of interior and outdoor spaces to the development of intelligent applications for positive behavioral intervention, and more generally to supporting computational psychology studies.
Year: 2020
DOI: 10.1145/3380997
Venue: Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies
Keywords: Deep Learning, Digital Mental Health, Mobile Sensing
DocType: Journal
Volume: 4
Issue: 1
ISSN: 2474-9567
Citations: 1
PageRank: 0.35
References: 10
Authors: 4
Name               Order  Citations  PageRank
Victor Darvariu    1      3          0.74
Laura Convertino   2      1          0.35
Abhinav Mehrotra   3      169        11.69
Mirco Musolesi     4      3365       204.65