| Abstract |
|---|
| Knowing who is in one's vicinity is key to managing privacy in everyday environments, but is challenging for people with visual impairments. Wearable cameras and other sensors may be able to detect such information, but how should this complex visually-derived information be conveyed in a way that is discreet, intuitive, and unobtrusive? Motivated by previous studies on the specific information that visually impaired people would like to have about their surroundings, we created three medium-fidelity prototypes: 1) a 3D printed model of a watch to convey tactile information; 2) a smartwatch app for haptic feedback; and 3) a smartphone app for audio feedback. A usability study with 14 participants with visual impairments identified a range of practical issues (e.g., speed of conveying information) and design considerations (e.g., configurable privacy bubble) for conveying privacy feedback in real-world contexts. |
| Year | Venue | DocType | Volume | Citations | PageRank | References | Authors |
|---|---|---|---|---|---|---|---|
| 2019 | arXiv: Human-Computer Interaction | Journal | abs/1904.06117 | 0 | 0.34 | 0 | 5 |
| Name | Order | Citations | PageRank |
|---|---|---|---|
| Tousif Ahmed | 1 | 32 | 6.26 |
| Rakibul Hasan | 2 | 12 | 2.92 |
| Kay H. Connelly | 3 | 489 | 42.61 |
| D. Crandall | 4 | 2111 | 168.58 |
| Apu Kapadia | 5 | 1449 | 83.13 |