Abstract |
---|
We explore new aspects of assistive living via smart social human-robot interaction (HRI), involving automatic recognition of multimodal gestures and speech in a natural interface that provides social features in HRI. We present a whole framework of resources, including datasets and tools, briefly demonstrated in two real-life use cases with elderly subjects: a multimodal interface for an assistive robotic rollator and an assistive bathing robot. We discuss these domain-specific tasks and the open-source tools that can be used to build such HRI systems, as well as indicative results. Sharing such resources can open new perspectives in assistive HRI. |
Year | DOI | Venue |
---|---|---|
2017 | 10.1145/3029798.3038400 | HRI (Companion) |
Keywords | Field | DocType
---|---|---|
assistive HRI, multimodal audio-gestural recognition | Use case, Simulation, Computer science, Gesture, Human–computer interaction, Robot, Natural user interface, Human–robot interaction | Conference
ISSN | Citations | PageRank
---|---|---|
2167-2121 | 1 | 0.36
References | Authors
---|---|
6 | 8
Name | Order | Citations | PageRank |
---|---|---|---|
Athanasia Zlatintsi | 1 | 46 | 4.49 |
Rodomagoulakis, I. | 2 | 18 | 3.45 |
Vassilis Pitsikalis | 3 | 118 | 10.44 |
Petros Koutras | 4 | 16 | 6.35 |
Nikolaos Kardaris | 5 | 6 | 1.20 |
Xanthi Papageorgiou | 6 | 46 | 10.83 |
Costas S. Tzafestas | 7 | 153 | 25.95 |
Petros Maragos | 8 | 3733 | 591.97 |