Title
Visually Interpretable Representation Learning for Depression Recognition from Facial Images
Abstract
Recent evidence in mental health assessment has demonstrated that facial appearance can be highly indicative of depressive disorder. While previous methods based on facial analysis promise to advance clinical diagnosis of depressive disorder in a more efficient and objective manner, challenges in the visual representation of complex depression patterns prevent the widespread practice of automated depression diagnosis. In this paper, we present a deep regression network termed DepressNet to learn a depression representation with visual explanation. Specifically, a deep convolutional neural network equipped with a global average pooling layer is first trained with facial depression data, which allows salient regions of the input image to be identified with respect to the predicted severity score based on the generated depression activation map (DAM). We then propose a multi-region DepressNet, in which multiple local deep regression models for different face regions are jointly learned and their responses are fused to improve the overall recognition performance. We evaluate our method on two benchmark datasets, and the results show that our method significantly boosts the state-of-the-art performance of visual-based depression recognition. Most importantly, the DAM induced by our learned deep model may help reveal visual depression patterns on faces and provide insights into automated depression diagnosis.
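The abstract describes a GAP-based regression CNN whose class-activation-style map (the DAM) highlights the face regions driving the predicted severity score. The sketch below is a minimal illustration of that idea under assumed choices, not the authors' implementation: it assumes a ResNet-18 backbone in PyTorch, and the names DepressRegressor and compute_dam are hypothetical.

```python
# Illustrative sketch of a CAM-style "depression activation map" (DAM) for a
# GAP-based regression CNN. Not the paper's code; backbone and names are assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F
from torchvision import models


class DepressRegressor(nn.Module):
    """CNN backbone -> global average pooling -> linear regression of severity."""

    def __init__(self):
        super().__init__()
        backbone = models.resnet18(weights=None)
        # Keep the convolutional layers, drop the original avgpool and FC head.
        self.features = nn.Sequential(*list(backbone.children())[:-2])
        self.gap = nn.AdaptiveAvgPool2d(1)
        self.fc = nn.Linear(512, 1)  # single output: depression severity score

    def forward(self, x):
        fmap = self.features(x)              # (B, 512, H, W) conv feature maps
        pooled = self.gap(fmap).flatten(1)   # (B, 512) after global average pooling
        score = self.fc(pooled)              # (B, 1) predicted severity
        return score, fmap


def compute_dam(model, image):
    """Weight the last conv maps by the regression weights (CAM-style) to get a DAM."""
    model.eval()
    with torch.no_grad():
        score, fmap = model(image.unsqueeze(0))         # fmap: (1, C, H, W)
        w = model.fc.weight.view(-1, 1, 1)              # (C, 1, 1) regression weights
        dam = (w * fmap[0]).sum(dim=0)                  # (H, W) weighted sum of maps
        dam = F.relu(dam)
        dam = (dam - dam.min()) / (dam.max() - dam.min() + 1e-8)
        dam = F.interpolate(dam[None, None], size=image.shape[1:],
                            mode="bilinear", align_corners=False)[0, 0]
    return score.item(), dam                            # severity + normalized DAM
```

The multi-region variant described in the abstract would, under the same assumptions, train one such regressor per face region and fuse their predicted scores.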
Year
2020
DOI
10.1109/TAFFC.2018.2828819
Venue
IEEE Transactions on Affective Computing
Keywords
Depression recognition, face recognition, deep convolutional neural network, depression activation map
DocType
Journal
Volume
11
Issue
3
ISSN
1949-3045
Citations
8
PageRank
0.54
References
22
Authors
4
Name              Order  Citations  PageRank
Xiuzhuang Zhou    1      380        20.26
Kai Jin           2      28         5.66
Yuanyuan Shang    3      210        16.83
Guodong Guo       4      2548       144.00