Title
Gesture-triggered dynamic gaze alignment architecture for intelligent eLearning systems.
Abstract
Current eLearning systems enable streaming of live lectures to distant students, facilitating live instructor-student interaction. However, studies have shown a marked divide between the experience of local students (those present at the teacher's location) and that of distant students. One of the major factors contributing to this rift is the lack of gaze-aligned interaction. In this paper, we present a system architecture that receives gesture triggers as input and dynamically calculates the perspective angle at which the speaking participant should be captured for the listener, facilitating eye contact. The gesture triggers are derived using a Microsoft Kinect sensor, which extracts skeleton joint information of the instructor and performs gesture recognition on the acquired joint data in real time. These recognized gestures serve as interaction-initiation triggers for dynamic perspective correction, aligning gaze during a conversation. For evaluation, we constructed a five-classroom test-bed with dynamic perspective correction; user-study results indicate a marked 42% enhancement in experience with the gaze correction in place.
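The abstract's pipeline (Kinect skeleton joints → gesture trigger → perspective-angle calculation for the listener's view) can be illustrated with a minimal sketch. The paper does not specify which gestures or formulas its recognizer uses, so the hand-raise rule, the joint names, and the angle computation below are illustrative assumptions only:

```python
import math

def is_hand_raised(joints):
    """Illustrative trigger rule (not the paper's): fire when the
    right hand joint rises above the head joint. `joints` maps a
    Kinect-style joint name to an (x, y, z) position in metres."""
    return joints["hand_right"][1] > joints["head"][1]

def perspective_angle(speaker_pos, listener_pos):
    """Horizontal angle, in degrees, from the speaker toward the
    listener's display -- i.e. the camera perspective that would
    simulate eye contact between the two participants."""
    dx = listener_pos[0] - speaker_pos[0]
    dz = listener_pos[2] - speaker_pos[2]
    return math.degrees(math.atan2(dx, dz))

# Example: instructor at the origin; the remote classroom's display
# sits 2 m to the right and 2 m ahead, so the view should be
# re-rendered at a 45-degree perspective shift.
joints = {"head": (0.0, 1.7, 0.0), "hand_right": (0.1, 1.9, 0.0)}
if is_hand_raised(joints):
    angle = perspective_angle((0.0, 1.7, 0.0), (2.0, 1.7, 2.0))
```

In a deployed system the trigger and angle would feed a view-synthesis stage that re-renders the speaker's video at the computed perspective; that stage is outside the scope of this sketch.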
Year: 2017
DOI: 10.3233/JIFS-169239
Venue: JOURNAL OF INTELLIGENT & FUZZY SYSTEMS
Keywords: Gaze correction, eye contact, gesture recognition, video streaming, eLearning
Field: Architecture, Gaze, Gesture, Computer science, Multimedia
DocType: Journal
Volume: 32
Issue: 4
ISSN: 1064-1246
Citations: 2
PageRank: 0.51
References: 4
Authors: 4

Name                 Order  Citations  PageRank
Ramkumar N           1      5          1.06
V. Rangan            2      4          2.54
Uma Gopalakrishnan   3      8          2.94
Balaji Hariharan     4      15         5.87