Title
Probabilistic Detection of Pointing Directions for Human-Robot Interaction
Abstract
Deictic gestures, i.e. pointing at things in human-human collaborative tasks, constitute a pervasive, non-verbal means of communication, used for example to direct attention towards objects of interest. In human-robot interaction, recognizing a pointing gesture and estimating its pose is a key requirement for delegating tasks from a human to a robot. Standard approaches rely on full-body or partial-body postures to detect the pointing direction. We present a probabilistic, appearance-based object detection framework that detects pointing gestures and robustly estimates the pointing direction. Our method estimates the pointing direction without assuming any human kinematic model. We propose a functional model for pointing that incorporates two types of pointing: finger pointing and tool pointing, i.e. pointing with an object held in the hand. We evaluate our method on a new dataset with 9 participants pointing at 10 objects.
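The sketch below is not the paper's algorithm; it is a minimal illustration of the general idea of probabilistically resolving a pointing target once a pointing direction has been estimated from appearance-based detections. It assumes a pointing ray defined by two detected 3D keypoints (e.g. hand and fingertip, or hand and tool tip) and scores candidate objects with a Gaussian model over angular deviation; the names `pointing_probabilities` and `ANGULAR_SIGMA` and the choice of Gaussian likelihood are illustrative assumptions, not taken from the paper.

```python
# Hedged sketch (assumed, not the paper's method): rank candidate target
# objects given an estimated pointing ray, using a Gaussian likelihood
# over the angle between the ray and each object direction.
import numpy as np

ANGULAR_SIGMA = np.deg2rad(10.0)  # assumed angular noise of the pointing estimate


def pointing_probabilities(ray_origin, ray_tip, object_positions):
    """Return a normalized probability per candidate object.

    ray_origin, ray_tip : (3,) arrays defining the pointing ray
    object_positions    : (N, 3) array of candidate object centroids
    """
    direction = ray_tip - ray_origin
    direction /= np.linalg.norm(direction)

    to_objects = object_positions - ray_origin
    to_objects /= np.linalg.norm(to_objects, axis=1, keepdims=True)

    # Angle between the pointing ray and each origin->object vector.
    angles = np.arccos(np.clip(to_objects @ direction, -1.0, 1.0))

    # Gaussian likelihood that each object is the pointing target.
    likelihood = np.exp(-0.5 * (angles / ANGULAR_SIGMA) ** 2)
    return likelihood / likelihood.sum()


if __name__ == "__main__":
    objects = np.array([[1.0, 0.0, 2.0], [0.0, 1.0, 2.0], [-1.0, 0.5, 2.0]])
    probs = pointing_probabilities(np.zeros(3), np.array([0.9, 0.1, 2.0]), objects)
    print(probs)  # highest probability for the object closest to the ray
```

The same scoring applies whether the ray comes from a finger or from a hand-held tool, which mirrors the paper's distinction between finger pointing and tool pointing at a purely functional level.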
Year
2015
DOI
10.1109/DICTA.2015.7371296
Venue
2015 International Conference on Digital Image Computing: Techniques and Applications (DICTA)
Keywords
human-robot interaction, deictic gestures, human-human collaborative tasks, pointing gesture pose estimation, probabilistic appearance-based object detection framework, pointing gesture detection, finger pointing, tool pointing
Field
Computer vision, Object detection, Kinematics, Computer science, Gesture, Delegate, Gesture recognition, Artificial intelligence, Probabilistic logic, Robot, Human–robot interaction
DocType
Conference
Citations
7
PageRank
0.47
References
14
Authors
3
Name, Order, Citations, PageRank
Dadhichi Shukla, 1, 21, 3.11
Özgür Erkent, 2, 26, 4.96
Justus H. Piater, 3, 543, 61.56