Title
Learning Semantics of Gestural Instructions for Human-Robot Collaboration.
Abstract
Designed to work safely alongside humans, collaborative robots need to be capable partners in human-robot teams. Besides key capabilities like detecting gestures, recognizing objects, grasping them, and handing them over, these robots need to seamlessly adapt their behavior for efficient human-robot collaboration. In this context we present the fast, supervised Proactive Incremental Learning (PIL) framework for learning associations between human hand gestures and the intended robotic manipulation actions. The proactive aspect enables the robot to predict the human's intent and perform an action without waiting for an instruction; the incremental aspect enables it to learn these associations on the fly while performing a task. The approach is probabilistic and statistically driven. As a proof of concept, we focus on a table-assembly task in which the robot assists its human partner. We investigate how the accuracy of gesture detection affects the number of interactions required to complete the task. We also conduct a human-robot interaction study with non-roboticist users, comparing the proactive robot with a reactive one that waits for instructions.
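The abstract describes PIL as a probabilistic, incrementally learned mapping from hand gestures to manipulation actions, with the robot acting proactively once it is confident about the human's intent. The paper's actual formulation is not reproduced here, so the following is only a minimal Python sketch of that general idea: it keeps running gesture-action counts, estimates P(action | gesture) from them, and suggests an action proactively once the estimate passes a confidence threshold; otherwise it signals that the robot should wait for an instruction (reactive behavior). The class and method names (GestureActionLearner, observe, predict) and the threshold value are illustrative assumptions, not part of the paper.

# Illustrative sketch only (not the paper's actual PIL algorithm): incremental
# gesture->action co-occurrence counts with a confidence-gated proactive prediction.
from collections import defaultdict

class GestureActionLearner:
    def __init__(self, threshold=0.8):
        self.threshold = threshold
        # counts[gesture][action] = number of times this pair has been observed
        self.counts = defaultdict(lambda: defaultdict(int))

    def observe(self, gesture, action):
        """Incremental update: record that `gesture` was followed by `action`."""
        self.counts[gesture][action] += 1

    def predict(self, gesture):
        """Return (action, probability) if confident enough to act proactively,
        otherwise (None, probability), meaning the robot should wait for an instruction."""
        actions = self.counts[gesture]
        total = sum(actions.values())
        if total == 0:
            return None, 0.0
        action, count = max(actions.items(), key=lambda kv: kv[1])
        prob = count / total  # simple maximum-likelihood estimate of P(action | gesture)
        return (action, prob) if prob >= self.threshold else (None, prob)

# Usage: after a few observed gesture-action pairs, predictions become proactive.
learner = GestureActionLearner(threshold=0.6)
for _ in range(3):
    learner.observe("point_at_leg", "grasp_leg")
learner.observe("point_at_leg", "hand_over_leg")
print(learner.predict("point_at_leg"))  # ('grasp_leg', 0.75)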
Year
2018
DOI
10.3389/fnbot.2018.00007
Venue
FRONTIERS IN NEUROROBOTICS
Keywords
human-robot collaboration, proactive learning, gesture understanding, intention prediction, user study
Field
Computer science, Gesture, Gesture recognition, Human–computer interaction, Proof of concept, Artificial intelligence, Probabilistic logic, Robot, Semantics, Machine learning, Human–robot interaction, Proactive learning
DocType
Journal
Volume
12
ISSN
1662-5218
Citations
0
PageRank
0.34
References
19
Authors
3
Name               Order  Citations  PageRank
Dadhichi Shukla    1      21         3.11
Özgür Erkent       2      26         4.96
Justus H. Piater   3      543        61.56