Title
Head gesture recognition using feature interpolation
Abstract
This paper addresses a technique for recognizing head gestures. The proposed system consists of eye tracking and head motion decision. The eye tracking step is divided into face detection, eye location, and eye feature interpolation. Face detection obtains the face region using an integrated feature space; multiple Bayesian classifiers are employed to select face candidate windows in this space. Eye location extracts the positions of the eyes from the detected face region and, for real-time tracking, is performed only in the region close to the previously found eye pair. If the eye pair cannot be located, the system estimates the feature vector using a mean velocity measure (MVM). After eye tracking, the coordinates of the detected eyes are transformed into a normalized vector of the x- and y-coordinates. Head gestures are then recognized by HMMs adapted to a directional vector that represents the direction of head movement; the HMMs are also used to distinguish neutral, positive, and negative gestures. Experimental results are reported: the techniques were run on a large set of images with notable success.
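A minimal sketch (not the authors' code) of the pipeline the abstract describes: extrapolating a missing eye position with a mean velocity measure (MVM), quantizing frame-to-frame head movement into directional symbols, and scoring the symbol sequence with per-gesture discrete HMMs. All function names, the 8-direction quantization, and the toy thresholds are assumptions made for illustration.

```python
# Hypothetical sketch of MVM interpolation, directional symbols, and HMM scoring.
import numpy as np


def mvm_estimate(history, n_frames=3):
    """Extrapolate the current eye-pair position from the mean velocity of
    the last few successfully tracked frames (used when eye location fails)."""
    pts = np.asarray(history[-(n_frames + 1):], dtype=float)
    mean_velocity = np.diff(pts, axis=0).mean(axis=0)   # average per-frame displacement
    return pts[-1] + mean_velocity


def direction_symbols(midpoints, n_directions=8, still_thresh=1.0):
    """Quantize frame-to-frame movement of the eye-pair midpoint into
    discrete direction symbols; symbol n_directions means 'no movement'."""
    symbols = []
    pts = np.asarray(midpoints, dtype=float)
    for (x0, y0), (x1, y1) in zip(pts[:-1], pts[1:]):
        dx, dy = x1 - x0, y1 - y0
        if np.hypot(dx, dy) < still_thresh:
            symbols.append(n_directions)                # "still" symbol
            continue
        angle = np.arctan2(dy, dx) % (2 * np.pi)
        symbols.append(int(angle // (2 * np.pi / n_directions)) % n_directions)
    return symbols


def hmm_log_likelihood(obs, pi, A, B):
    """Scaled forward algorithm: log P(obs | HMM) for a discrete-output HMM
    with initial distribution pi, transition matrix A, and emission matrix B."""
    alpha = pi * B[:, obs[0]]
    log_p = np.log(alpha.sum())
    alpha /= alpha.sum()
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]
        c = alpha.sum()
        log_p += np.log(c)
        alpha /= c
    return log_p


def classify_gesture(obs, gesture_hmms):
    """Pick the gesture (e.g. 'positive', 'negative', 'neutral') whose HMM
    assigns the observation sequence the highest likelihood."""
    return max(gesture_hmms,
               key=lambda g: hmm_log_likelihood(obs, *gesture_hmms[g]))
```

In this reading of the abstract, one HMM per gesture class would be trained on such direction-symbol sequences (e.g. with Baum-Welch), and the recognizer selects the highest-scoring model to label a head movement as positive, negative, or neutral.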
Year: 2006
DOI: 10.1007/11892960_70
Venue: KES (1)
Keywords: real-time eye tracking, face detection, eye tracking step, eye feature interpolation, eye location, integrated feature space, face region, directional vector, eye tracking, head gesture, head gesture recognition, real time, feature vector, feature space, gesture recognition, bayesian classifier
Field: Facial recognition system, Computer vision, Feature vector, Computer science, Direction vector, Gesture recognition, Eye tracking, Artificial intelligence, Face detection, Motion estimation, Unit vector
DocType: Conference
Volume: 4251
ISSN: 0302-9743
ISBN: 3-540-46535-9
Citations: 0
PageRank: 0.34
References: 9
Authors: 2
Name | Order | Citations | PageRank
Yeon Gu Kang | 1 | 6 | 0.78
Phill Kyu Rhee | 2 | 60 | 24.82