Title
A Multi-View Hand Gesture RGB-D Dataset for Human-Robot Interaction Scenarios
Abstract
Understanding semantic meaning from hand gestures is a challenging but essential task in human-robot interaction scenarios. In this paper we present a baseline evaluation of the Innsbruck Multi-View Hand Gesture (IMHG) dataset [1] recorded with two RGB-D cameras (Kinect). As a baseline, we adopt a probabilistic appearance-based framework [2] to detect a hand gesture and estimate its pose using two cameras. The dataset consists of two types of deictic gestures with the ground truth location of the target, two symbolic gestures, two manipulative gestures, and two interactional gestures. We discuss the effect of parallax due to the offset between head and hand while performing deictic gestures. Furthermore, we evaluate the proposed framework to estimate the potential referents on the Innsbruck Pointing at Objects (IPO) dataset [2].
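The parallax mentioned in the abstract arises because the ray from the head through the hand and the ray along the forearm generally intersect the target surface at different points, so the same deictic gesture can be read as referring to different locations. The short Python/NumPy sketch below only illustrates that geometric effect; it is not the appearance-based framework of [2], and all positions, the table plane, and the ray_plane_intersection helper are assumed values chosen for the example.

import numpy as np

def ray_plane_intersection(origin, direction, plane_point, plane_normal):
    # Intersect a ray (origin + t * direction) with a plane and return the
    # 3D intersection point.
    direction = direction / np.linalg.norm(direction)
    denom = np.dot(plane_normal, direction)
    if abs(denom) < 1e-9:
        raise ValueError("ray is (nearly) parallel to the plane")
    t = np.dot(plane_normal, plane_point - origin) / denom
    return origin + t * direction

# Hypothetical positions in metres (world frame): a person in front of a
# table whose top is the y = 0.75 plane, pointing at a target on it.
head  = np.array([0.00, 1.60, 1.20])   # approximate eye position
elbow = np.array([0.25, 1.20, 1.00])
hand  = np.array([0.45, 1.10, 0.70])   # fingertip

table_point  = np.array([0.0, 0.75, 0.0])
table_normal = np.array([0.0, 1.00, 0.0])

# Two common interpretations of the pointing direction:
#   (a) head-to-hand (eye-finger) ray, (b) elbow-to-hand (forearm) ray.
target_eye_finger = ray_plane_intersection(head, hand - head, table_point, table_normal)
target_forearm    = ray_plane_intersection(elbow, hand - elbow, table_point, table_normal)

# The distance between the two intersections is the parallax-induced offset
# between the two readings of the same deictic gesture.
print("eye-finger ray hits the table at", target_eye_finger)
print("forearm ray hits the table at   ", target_forearm)
print("offset between referents (m):", np.linalg.norm(target_eye_finger - target_forearm))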
Year
2016
Venue
2016 25th IEEE International Symposium on Robot and Human Interactive Communication (RO-MAN)
Field
Computer vision, Parallax, Gesture, Computer science, Simulation, Gesture recognition, Pose, Ground truth, Artificial intelligence, Deixis, Probabilistic logic, Human–robot interaction
DocType
Conference
ISSN
1944-9445
Citations
1
PageRank
0.35
References
20
Authors
3
Name | Order | Citations | PageRank
Dadhichi Shukla | 1 | 21 | 3.11
Özgür Erkent | 2 | 26 | 4.96
Justus H. Piater | 3 | 543 | 61.56