Abstract |
---|
Tactile displays have predominantly been used for information transfer using patterns or as assistive feedback for interactions. With recent advances in hardware for conveying increasingly rich tactile information that mirrors visual information, and the increasing viability of wearables that remain in constant contact with the skin, there is a compelling argument for exploring tactile interactions as rich as visual/aural displays. We introduce dialogic manipulation, a form of direct manipulation for tactile displays that enables their use in a manner similar to visual displays for GUI-like pointing, selecting, and target manipulation interactions. We build a novel proof-of-concept system and demonstrate the feasibility of the concept and the usability of its implementation. We establish the significant effects of multiple parameters on target acquisition performance, including a new factor, position on the skin. We conclude with application scenarios and future research directions in this novel space. |
Year | DOI | Venue |
---|---|---|
2015 | 10.1145/2820619.2820640 | IHM |
Field | DocType | Citations
---|---|---
Dialogic, Information transfer, Target acquisition, Computer science, Wearable computer, Usability, Human–computer interaction, Proof of concept, Multimedia | Conference | 0

PageRank | References | Authors
---|---|---
0.34 | 17 | 4
Name | Order | Citations | PageRank |
---|---|---|---
Thomas Pietrzak | 1 | 107 | 10.14 |
Nicolas Roussel | 2 | 220 | 16.22 |
Aakar Gupta | 3 | 122 | 8.32 |
Ravin Balakrishnan | 4 | 6497 | 403.55 |