Abstract |
---|
Drawing attention to objects and localizing them in the environment are essential building blocks for domestic robot applications, e.g., fetch-and-delivery or navigation tasks. For this purpose, human pointing gestures have proven to be a natural and intuitive interaction method for transferring the spatial information of an object from human to robot. Current approaches use only the robot's on-board sensors to perceive gesture-based instructions, which restricts them to the field of view of the robot's camera. The integration of mobile robots into intelligent environments, such as smart homes, opens new possibilities to overcome this limitation by utilizing components of the surrounding environment as additional sensors. We take advantage of these new possibilities and propose a multi-stage object localization system based on human pointing gestures that treats the whole intelligent environment as an interaction partner. Our experimental results show that our multi-stage approach successfully refines the position initially proposed by a human pointing gesture by employing a distributed camera network integrated into the environment for object localization. |
Year | DOI | Venue |
---|---|---|
2018 | 10.1109/IE.2018.00015 | 2018 14th International Conference on Intelligent Environments (IE) |
Keywords | Field | DocType |
---|---|---|
Human pointing gestures, Object localization, Intelligent environment, Distributed camera network | Spatial analysis, Intelligent environment, Field of view, Gesture, Computer science, Domestic robot, Human–computer interaction, Probabilistic logic, Robot, Mobile robot | Conference |
ISSN | ISBN | Citations |
---|---|---|
2469-8792 | 978-1-5386-6845-0 | 2 |
PageRank | References | Authors |
---|---|---|
0.39 | 0 | 5 |
Name | Order | Citations | PageRank |
---|---|---|---|
Dennis Sprute | 1 | 12 | 5.35 |
Robin Rasch | 2 | 7 | 1.84 |
Aljoscha Pörtner | 3 | 4 | 1.44 |
Sven Battermann | 4 | 5 | 1.81 |
Matthias König | 5 | 2 | 0.73 |