Title
Fused features mining for depth-based hand gesture recognition to classify blind human communication
Abstract
Gesture recognition and hand pose tracking are widely applicable techniques in human–computer interaction. Depth data obtained by depth cameras provide a highly informative description of the body, and in particular the hand pose, that can support more accurate gesture recognition systems. Hand detection and feature extraction are very challenging tasks in RGB images, yet they can be resolved effectively and simply with depth data. Moreover, depth data can be combined with color information for more reliable recognition. A typical hand gesture recognition system must identify the hand and its position or orientation, extract useful features, and apply a suitable machine-learning method to classify the performed gesture. This paper presents a novel fusion of enhanced features for the classification of static signs of sign language. It begins by explaining how the hand can be separated from the scene using depth data. Then, a combined feature-extraction method is introduced to extract appropriate features from the images. Finally, an artificial neural network classifier is trained on these fused features and used to critically analyze the performance of the various descriptors.
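The fused-features idea described in the abstract (frequency-domain DCT coefficients concatenated with shape-based moment invariants, then fed to a neural network classifier) can be sketched as below. This is a minimal numpy sketch under stated assumptions: the function names, the 8×8 low-frequency DCT block, and the use of the seven Hu moment invariants are illustrative choices, not details taken from the paper.

```python
import numpy as np

def dct2(img):
    """2-D orthonormal DCT-II via two separable 1-D transforms (numpy only)."""
    def dct1(x):
        N = x.shape[0]
        n = np.arange(N)
        k = n.reshape(-1, 1)
        basis = np.cos(np.pi * (2 * n + 1) * k / (2 * N))
        scale = np.full(N, np.sqrt(2.0 / N))
        scale[0] = np.sqrt(1.0 / N)
        return scale[:, None] * (basis @ x)
    return dct1(dct1(img).T).T

def hu_moments(img):
    """Seven Hu invariant moments from scale-normalized central moments."""
    y, x = np.mgrid[:img.shape[0], :img.shape[1]].astype(float)
    m00 = img.sum()
    xbar, ybar = (x * img).sum() / m00, (y * img).sum() / m00
    def eta(p, q):  # normalized central moment
        mu = (((x - xbar) ** p) * ((y - ybar) ** q) * img).sum()
        return mu / m00 ** (1 + (p + q) / 2.0)
    n20, n02, n11 = eta(2, 0), eta(0, 2), eta(1, 1)
    n30, n03, n21, n12 = eta(3, 0), eta(0, 3), eta(2, 1), eta(1, 2)
    h1 = n20 + n02
    h2 = (n20 - n02) ** 2 + 4 * n11 ** 2
    h3 = (n30 - 3 * n12) ** 2 + (3 * n21 - n03) ** 2
    h4 = (n30 + n12) ** 2 + (n21 + n03) ** 2
    h5 = ((n30 - 3 * n12) * (n30 + n12)
          * ((n30 + n12) ** 2 - 3 * (n21 + n03) ** 2)
          + (3 * n21 - n03) * (n21 + n03)
          * (3 * (n30 + n12) ** 2 - (n21 + n03) ** 2))
    h6 = ((n20 - n02) * ((n30 + n12) ** 2 - (n21 + n03) ** 2)
          + 4 * n11 * (n30 + n12) * (n21 + n03))
    h7 = ((3 * n21 - n03) * (n30 + n12)
          * ((n30 + n12) ** 2 - 3 * (n21 + n03) ** 2)
          - (n30 - 3 * n12) * (n21 + n03)
          * (3 * (n30 + n12) ** 2 - (n21 + n03) ** 2))
    return np.array([h1, h2, h3, h4, h5, h6, h7])

def fused_features(depth_img, k=8):
    """Fuse the k*k low-frequency DCT block with the Hu moments
    into a single descriptor vector for a downstream classifier."""
    img = depth_img.astype(float)
    coeffs = dct2(img)
    return np.concatenate([coeffs[:k, :k].ravel(), hu_moments(img)])
```

In a full pipeline, `fused_features` would be computed on each depth-segmented hand image and the resulting vectors used to train a feed-forward neural network (e.g. one hidden layer with a softmax output over the static signs).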
Year: 2017
DOI: 10.1007/s00521-016-2244-5
Venue: Neural Computing and Applications
Keywords: Hand gesture recognition, Depth data, DCT, Moment invariant, Fused features mining
Field: Computer vision, Pattern recognition, Computer science, Gesture, Discrete cosine transform, Gesture recognition, Feature extraction, Sign language, RGB color model, Artificial intelligence, Human communication, Artificial neural network classifier
DocType: Journal
Volume: 28
Issue: 11
ISSN: 1433-3058
Citations: 6
PageRank: 0.43
References: 16
Authors: 5

Name                   Order  Citations  PageRank
Saba Jadooki           1      6          0.43
Dzulkifli Mohamad      2      96         13.41
Tanzila Saba           3      326        47.33
Abdulaziz S. Almazyad  4      20         5.84
Amjad Rehman           5      181        23.00