Abstract
---
The objective is to develop a mobile human-robot interface optimized for multi-touch input. Our existing interface was designed for mouse and keyboard input and was later adapted for voice and touch interaction. The new multi-touch interface supports multi-touch gestures, such as zooming and panning a map, as well as robot-task-specific touch interactions. An initial user evaluation found that the multi-touch interface is preferred and yields superior performance.
| Year | DOI | Venue |
|---|---|---|
| 2010 | 10.1109/HRI.2010.5453253 | Human-Robot Interaction |

| Keywords | Field | DocType |
|---|---|---|
| keyboard input, tasking robot, multi-touch interaction, touch interaction, example zooming, new multi-touch interface, multi-touch interface, multi-touch gesture, mobile human-robot interface, existing interface, robot task specific touch, multi-touch input, mobile computing, mobile robots, human robot interaction, shape, displays, mobile communication, human robot interface | Computer science, Gesture, Human–computer interaction, Artificial intelligence, Multi-touch, 10-foot user interface, Human–robot interaction, Computer vision, Simulation, Zoom, Robot, Natural user interface, Mobile robot | Conference |

| ISSN | ISBN | Citations |
|---|---|---|
| 2167-2121 | 978-1-4244-4893-7 | 7 |

| PageRank | References | Authors |
|---|---|---|
| 0.51 | 3 | 3 |
| Name | Order | Citations | PageRank |
|---|---|---|---|
| Sean Timothy Hayes | 1 | 8 | 0.87 |
| Eli R. Hooten | 2 | 10 | 0.97 |
| Julie A. Adams | 3 | 392 | 53.75 |