Title
Perception for Everyday Human Robot Interaction.
Abstract
The ability to build robotic agents that can perform everyday tasks depends heavily on understanding how humans perform them. To achieve a close-to-human understanding of a task and generate a formal representation of it, it is important to jointly reason about the human actions and the objects being acted on. We present a robotic perception framework for perceiving actions performed by a human in a household environment that can be used to answer questions such as “which object did the human act on?” or “which actions did the human perform?”. To do this, we extend the RoboSherlock framework with the capability of detecting humans and objects at the same time, while simultaneously reasoning about the possible actions being performed.
Year
2016
DOI
10.1007/s13218-015-0400-1
Venue
KI
Keywords
Object Detection, Action Recognition, Belief State, Object Label, Robotic Agent
Field
Computer vision, Object detection, Everyday tasks, Computer science, Formal representation, Action recognition, Human–computer interaction, Artificial intelligence, Perception, Machine learning, Human–robot interaction
DocType
Journal
Volume
30
Issue
1
ISSN
1610-1987
Citations
2
PageRank
0.41
References
11
Authors
3
Name                      Order  Citations  PageRank
Jan-Hendrik Worch         1      3          0.76
Ferenc Balint-Benczedi    2      57         6.03
Michael Beetz             3      3784       284.03