Title
Pointing Estimation for Human-Robot Interaction Using Hand Pose, Verbal Cues, and Confidence Heuristics
Abstract
People use pointing directives frequently and effortlessly. Robots will therefore need to interpret these directives in order to understand the user's intention. This is not a trivial task, as the intended pointing direction rarely aligns with the ground-truth pointing vector. Standard methods require head, arm, and hand pose estimation, which inhibits recognition of the more complex pointing gestures found in human-human interactions. In this work, we aim to interpret pointing directives using the pose of the index finger, capturing both simple and complex gestures. This method can also act as a fallback when full-body pose information is not available. This paper demonstrates the ability of a robot to determine pointing direction using data collected from a Microsoft Kinect camera. The finger joints are detected in 3D space and used in conjunction with verbal cues from the user to determine the point of interest (POI). In addition, confidence heuristics are provided to assess the quality of the source information, whether verbal or physical. We evaluated the performance of these features with a support vector machine, a decision tree, and a generalized model that does not rely on a learning algorithm.
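The following is a minimal sketch, not the authors' implementation, of the core idea in the abstract: cast a ray through two index-finger joints detected in 3D (e.g. from a Kinect) and select the candidate point of interest whose direction best aligns with that ray. The joint inputs, POI list, and angular confidence heuristic are illustrative assumptions, not details from the paper.

import numpy as np

def pointing_ray(knuckle_xyz, fingertip_xyz):
    """Return the origin and unit direction of the index-finger pointing ray."""
    origin = np.asarray(knuckle_xyz, dtype=float)
    direction = np.asarray(fingertip_xyz, dtype=float) - origin
    return origin, direction / np.linalg.norm(direction)

def select_poi(origin, direction, candidate_pois):
    """Pick the POI with the smallest angle to the pointing ray and
    report a simple confidence score in [0, 1] (assumed heuristic)."""
    best_name, best_angle = None, np.inf
    for name, position in candidate_pois.items():
        to_poi = np.asarray(position, dtype=float) - origin
        to_poi /= np.linalg.norm(to_poi)
        angle = np.arccos(np.clip(direction @ to_poi, -1.0, 1.0))
        if angle < best_angle:
            best_name, best_angle = name, angle
    # Heuristic confidence: 1 when perfectly aligned, 0 at 90 degrees or more.
    confidence = max(0.0, 1.0 - best_angle / (np.pi / 2))
    return best_name, confidence

# Example with hypothetical finger joints and labelled POIs in camera coordinates (metres).
origin, direction = pointing_ray([0.05, 0.10, 0.60], [0.09, 0.11, 0.66])
pois = {"mug": [0.45, 0.18, 1.25], "book": [-0.30, 0.05, 1.10], "lamp": [0.10, 0.60, 1.40]}
print(select_poi(origin, direction, pois))

In the full system described by the abstract, such a geometric estimate would be combined with verbal cues and confidence heuristics before the final POI is chosen.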
Year
2018
DOI
10.1007/978-3-319-91485-5_31
Venue
Lecture Notes in Computer Science
Keywords
Pointing, Object detection, Object localization, Social interaction
Field
Decision tree, Data mining, Computer science, Gesture, Support vector machine, Pose, Human–computer interaction, Heuristics, Point of interest, Robot, Human–robot interaction
DocType
Conference
Volume
10914
ISSN
0302-9743
Citations
0
PageRank
0.34
References
10
Authors
2
Name             Order  Citations  PageRank
Andrew Showers   1      0          0.34
Mei Si           2      259        24.87