Title
Visual-based human-machine interface using hand gestures
Abstract
This paper presents a new paradigm for visual-based interaction with computers using body gestures. The paradigm is based on statistical classification for gesture selection. It has applications in daily interaction with computers, computer games, telemedicine, virtual reality, and sign language studies. Specifically, hand gesture selection and recognition are considered as an example. The aims of this paper are: (a) to show how to select an appropriate set of gestures with a satisfactory level of discrimination power, and (b) to compare invariant moments (conventional and Zernike) with geometric properties for recognizing hand gestures. Two-dimensional structures, namely cluster-property and cluster-features matrices, have been employed for gesture selection and for evaluating different gesture characteristics. Experimental results confirm the better performance of the geometric features compared to moment invariants and Zernike moments.
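The abstract contrasts moment invariants (conventional and Zernike) with geometric shape properties as features for hand gesture recognition. The sketch below is an illustration only, not the authors' implementation: it extracts Hu moment invariants and a few geometric properties from a binary hand-silhouette image with OpenCV and classifies a gesture by nearest-neighbour matching against per-gesture template vectors. The function names, feature choices, and classifier here are assumptions made for the sketch.

```python
# Illustrative sketch only (not the paper's code): Hu moment invariants plus
# simple geometric properties as features for hand-gesture recognition.
# Assumes OpenCV 4.x (findContours returns contours, hierarchy) and NumPy.
import cv2
import numpy as np

def hand_features(binary_mask: np.ndarray) -> np.ndarray:
    """Return a feature vector: 7 Hu moment invariants + 4 geometric properties."""
    # The largest external contour is assumed to be the hand silhouette.
    contours, _ = cv2.findContours(binary_mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    hand = max(contours, key=cv2.contourArea)

    # Hu's seven moment invariants, log-scaled so magnitudes are comparable.
    hu = cv2.HuMoments(cv2.moments(hand)).flatten()
    hu = -np.sign(hu) * np.log10(np.abs(hu) + 1e-12)

    # Simple geometric properties: extent, aspect ratio, solidity, circularity.
    area = cv2.contourArea(hand)
    perimeter = cv2.arcLength(hand, True)
    x, y, w, h = cv2.boundingRect(hand)
    hull_area = cv2.contourArea(cv2.convexHull(hand))
    geom = np.array([
        area / (w * h),                               # extent
        w / h,                                        # aspect ratio
        area / (hull_area + 1e-12),                   # solidity
        4.0 * np.pi * area / (perimeter**2 + 1e-12),  # circularity
    ])
    return np.concatenate([hu, geom])

def classify(query: np.ndarray, templates: dict) -> str:
    """Nearest-neighbour gesture label over {gesture_name: feature_vector} templates."""
    return min(templates, key=lambda g: np.linalg.norm(query - templates[g]))
```

Zernike moments could be computed analogously (for instance with the mahotas library's zernike_moments function) and compared on the same footing as the Hu and geometric features above.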
Year
2007
DOI
10.1109/ISSPA.2007.4555302
Venue
ISSPA 2007, Sharjah
Keywords
gesture recognition, human computer interaction, image classification, Zernike moment invariant, computer games, hand gesture recognition, hand gesture selection, sign language study, statistical classification, telemedicine, virtual reality, visual-based human-machine interface
Field
Computer vision, Gesture, Computer science, Gesture recognition, Feature extraction, Zernike polynomials, Speech recognition, Sign language, Artificial intelligence, Hidden Markov model, Statistical classification, Contextual image classification
DocType
Conference
ISBN
978-1-4244-1779-8
Citations
1
PageRank
0.37
References
5
Authors
2
Name | Order | Citations | PageRank
Abdolah Chalechale | 1 | 73 | 7.12
Golshah Naghdy | 2 | 29 | 9.36