Abstract |
---|
In this paper, we describe the assistive system ADAMAAS (Adaptive and Mobile Action Assistance), introducing a new advanced smartglasses technology. The aim of ADAMAAS is to move from stationary status-diagnosis systems to a mobile and adaptive action support and monitoring system that can dynamically react in a context-sensitive way to human error (slips and mistakes) and provide individualized feedback on a transparent virtual plane superimposed on the user's field of view. For this purpose, ADAMAAS uses advanced technologies such as augmented reality (AR), eye tracking, object recognition, and systematic analysis of users' mental representations in long-term memory. Preliminary user tests with disabled participants at an early prototype stage revealed no substantial physical restrictions in the execution of their activities, yielded positive feedback regarding the assistive hints, and indicated that participants could imagine wearing the glasses for long periods of time. |
Year | DOI | Venue |
---|---|---|
2016 | 10.1145/2910674.2910727 | PETRA |
Field | DocType | Citations |
---|---|---|
Field of view, Monitoring system, Computer science, Simulation, Human error, Augmented reality, Eye tracking, Human–computer interaction, Multimedia, Mental representation, Cognitive neuroscience of visual object recognition | Conference | 1 |
PageRank | References | Authors |
---|---|---|
0.43 | 6 | 3 |
Name | Order | Citations | PageRank |
---|---|---|---|
KAI ESSIG | 1 | 33 | 4.49 |
Benjamin Strenge | 2 | 1 | 0.43 |
Thomas Schack | 3 | 33 | 7.51 |