Title
Comparative Analysis of Facial Affect Detection Algorithms.
Abstract
There has been much research on facial affect detection, but many approaches fall short of accurately identifying expressions due to changes in illumination, occlusion, or noise in uncontrolled environments. In addition, little research has implemented these algorithms across multiple datasets while varying the dataset size and the dimensions of each image. Our ultimate goal is to develop an optimized algorithm that can be used for real-time affect detection in automated vehicles. To this end, in this study we implemented facial affect detection algorithms with various datasets and conducted a comparative analysis of performance across the algorithms. The algorithms implemented in the study were a Convolutional Neural Network (CNN) in TensorFlow, FaceNet with transfer learning, and a Capsule Network. Each algorithm was trained on three datasets (FER2013, CK+, and Ohio) to obtain the predicted results. The Capsule Network showed the best detection accuracy (99.3%) with the CK+ dataset. Results are discussed with implications and future work.
Year
2020
DOI
10.1109/SMC42975.2020.9283205
Venue
SMC
DocType
Conference
Citations
0
PageRank
0.34
References
0
Authors
2
Name                 Order  Citations  PageRank
Ashin Marin Thomas   1      0          0.34
Myounghoon Jeon      2      113        36.51
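
As a rough illustration of the CNN baseline named in the abstract, the following is a minimal sketch in TensorFlow, assuming FER2013-style input (48x48 grayscale faces, 7 emotion classes). It is not the authors' exact architecture; layer sizes and hyperparameters are illustrative only.

# Minimal sketch (not the paper's architecture): a small CNN classifier
# for 48x48 grayscale face images with 7 emotion classes (FER2013-style).
import tensorflow as tf
from tensorflow.keras import layers, models

NUM_CLASSES = 7            # FER2013 emotion categories
INPUT_SHAPE = (48, 48, 1)  # FER2013 images are 48x48 grayscale

def build_cnn():
    """Return a small baseline CNN; layer sizes are illustrative."""
    model = models.Sequential([
        layers.Input(shape=INPUT_SHAPE),
        layers.Conv2D(32, 3, activation="relu", padding="same"),
        layers.MaxPooling2D(),
        layers.Conv2D(64, 3, activation="relu", padding="same"),
        layers.MaxPooling2D(),
        layers.Conv2D(128, 3, activation="relu", padding="same"),
        layers.MaxPooling2D(),
        layers.Flatten(),
        layers.Dropout(0.5),
        layers.Dense(128, activation="relu"),
        layers.Dense(NUM_CLASSES, activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model

# Usage sketch: x_train / y_train would come from a FER2013, CK+, or Ohio
# data loader, with pixel values scaled to [0, 1].
# model = build_cnn()
# model.fit(x_train, y_train, validation_split=0.1, epochs=30, batch_size=64)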