Abstract |
---|
In this paper, we describe a robot that interacts with humans in a crowded conference environment. The robot detects faces, determines the shirt color of onlooking conference attendees, and reacts with a combination of speech, musical, and movement responses. It continuously updates an internal emotional state, modeled after findings from human psychology research. Using empirically determined mapping functions, the robot's position in the emotion space is translated into a particular set of sound and movement responses. We successfully demonstrated this system at the AAAI '05 Open Interaction Event, showing the potential of emotional modeling to improve human-robot interaction. |
Year | DOI | Venue
---|---|---
2006 | 10.1109/IROS.2006.282327 | Beijing
Keywords | Field | DocType
---|---|---
face recognition, intelligent robots, mobile robots, robot vision, AAAI '05 Open Interaction Event, emotion-based decision mechanisms, face detection, human-robot interaction, internal emotional state, social mobile robot, robot emotions | Facial recognition system, Computer vision, Social robot, Computer science, Intelligent robots, Psychological research, Artificial intelligence, Face detection, Robot, Mobile robot, Human–robot interaction | Conference
ISBN | Citations | PageRank
---|---|---
1-4244-0259-X | 10 | 0.99
References | Authors
---|---
4 | 6
Name | Order | Citations | PageRank
---|---|---|---
Geoffrey A. Hollinger | 1 | 334 | 27.61 |
Yavor Georgiev | 2 | 10 | 0.99 |
Anthony Manfredi | 3 | 10 | 0.99 |
Bruce A. Maxwell | 4 | 197 | 23.01 |
Zachary Pezzementi | 5 | 152 | 10.54 |
Benjamin Mitchell | 6 | 10 | 1.33 |