Title
Deformation modeling and classification using deep convolutional neural networks for computerized analysis of neuropsychological drawings
Abstract
Drawing-based tests are cost-effective, noninvasive screening methods widely employed by psychologists for the early detection and diagnosis of various neuropsychological disorders. Computerized analysis of such drawings is a complex task, owing to the high degree of deformation present in the responses and the reliance on extensive clinical manifestations for their interpretation. Traditional rule-based approaches employed in visual-analysis systems prove insufficient to model all possible clinical deformations, while procedural-analysis techniques may conflict with standard test administration and evaluation protocols. Leveraging the increasing popularity of convolutional neural networks (CNNs), we propose an effective technique for modeling and classifying dysfunction-indicating deformations in drawings without modifying clinical standards. In contrast to conventional sketch recognition applications, where CNNs are trained to suppress intra-class shape variations, we employ deformation-specific augmentation to enhance the presence of specific deviations defined by clinical practitioners. The performance of the proposed technique is evaluated using Lacks' scoring of the Bender-Gestalt test as a case study. Our experimental results substantiate that the proposed approach can represent domain knowledge sufficiently without extensive heuristics and can effectively identify drawing-based biomarkers for various neuropsychological disorders.
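The core idea described in the abstract, synthesizing a clinically defined deviation (for example rotation, one of Lacks' indicators) rather than training the network to be invariant to it, can be illustrated with the sketch below. This is a minimal illustrative sketch, not the authors' implementation: the PyTorch/torchvision stack, the VGG-16 backbone, the folder names (data/labelled, data/clean), and all hyperparameters are assumptions made for the example.

```python
import torch
import torch.nn as nn
from torch.utils.data import ConcatDataset, DataLoader, Dataset
from torchvision import datasets, models, transforms

IMG_SIZE = 224

# Plain preprocessing for clinician-labelled scans (no invariance-style augmentation).
base_tf = transforms.Compose([
    transforms.Grayscale(num_output_channels=3),
    transforms.Resize((IMG_SIZE, IMG_SIZE)),
    transforms.ToTensor(),
])

# Deformation-specific augmentation: rather than teaching the CNN to ignore
# rotation, clean drawings are deliberately rotated to synthesize additional
# examples of the "rotation" deviation.
rotate_tf = transforms.Compose([
    transforms.Grayscale(num_output_channels=3),
    transforms.RandomRotation(degrees=(45, 135), fill=255),
    transforms.Resize((IMG_SIZE, IMG_SIZE)),
    transforms.ToTensor(),
])

class SyntheticRotated(Dataset):
    """Relabels rotated copies of clean drawings as class 1 (deformation present)."""
    def __init__(self, folder):
        # Hypothetical layout: folder contains one subdirectory of clean drawings.
        self.base = datasets.ImageFolder(folder, transform=rotate_tf)
    def __len__(self):
        return len(self.base)
    def __getitem__(self, idx):
        img, _ = self.base[idx]
        return img, 1

# Hypothetical folders: data/labelled/{absent,present} (clinician-scored) and
# data/clean/drawings (non-deformed responses used to synthesize positives).
real = datasets.ImageFolder("data/labelled", transform=base_tf)  # absent=0, present=1
synthetic = SyntheticRotated("data/clean")
loader = DataLoader(ConcatDataset([real, synthetic]), batch_size=16, shuffle=True)

# Fine-tune an ImageNet-pretrained backbone as a binary deformation detector.
model = models.vgg16(weights=models.VGG16_Weights.IMAGENET1K_V1)
model.classifier[6] = nn.Linear(4096, 2)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
criterion = nn.CrossEntropyLoss()

model.train()
for images, labels in loader:
    optimizer.zero_grad()
    loss = criterion(model(images), labels)
    loss.backward()
    optimizer.step()
```

The deciding detail is that the rotation augmentation is attached only to the synthesized positive class, enriching the deformation of interest instead of averaging it out as conventional augmentation would.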
Year
2020
DOI
10.1007/s00521-020-04735-8
Venue
NEURAL COMPUTING & APPLICATIONS
Keywords
Neuropsychological drawings, Deformation classification, Deep visual features, Bender-Gestalt test
DocType
Journal
Volume
32
Issue
16
ISSN
0941-0643
Citations
1
PageRank
0.35
References
0
Authors
4
Name, Order, Citations, PageRank
Momina Moetesum, 1, 1, 1.70
Imran Siddiqi, 2, 421, 36.56
Shoaib Ehsan, 3, 110, 24.43
Nicole Vincent, 4, 218, 26.66