Title
TouchPosing: multi-modal interaction with geospatial data
Abstract
Multi-touch interaction offers opportunities to interact with complex data. The exploration of geographical data in particular, which to date mostly relies on mouse and keyboard input, could benefit from this interaction paradigm. However, the gestures required to interact with complex systems such as Geographic Information Systems (GIS) grow more difficult with every additional function. This paper describes a novel interaction approach that allows non-expert users to easily explore geographic data through a combination of multi-touch gestures and hand postures. The use of hand pose as an additional input modality is intended to avoid the need for more complex multi-touch gestures. Furthermore, the screen of a wearable device serves as an additional output modality that avoids occlusion on the one hand and acts as a magic lens on the other.
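As a purely illustrative sketch (not taken from the paper), the following Python snippet shows one way the described idea could be realized: the recognized hand pose acts as a mode switch, so the same simple drag or pinch gesture is dispatched to different GIS operations. All names used here (HandPose, Gesture, MapView, apply) are hypothetical and only serve to illustrate the multi-modal dispatch.

# Illustrative sketch only: a hand pose selects the mode, so simple
# gestures can trigger different map operations without complex chords.
from dataclasses import dataclass
from enum import Enum, auto


class HandPose(Enum):
    FLAT = auto()     # open hand resting beside the touch point
    FIST = auto()
    L_SHAPE = auto()


class Gesture(Enum):
    DRAG = auto()     # one-finger drag
    PINCH = auto()    # two-finger pinch


@dataclass
class MapView:
    lon: float = 0.0
    lat: float = 0.0
    zoom: float = 1.0
    layer: str = "streets"


def apply(view: MapView, gesture: Gesture, pose: HandPose,
          dx: float = 0.0, dy: float = 0.0, scale: float = 1.0) -> MapView:
    """Dispatch a (gesture, hand pose) pair to a GIS operation."""
    if gesture is Gesture.DRAG and pose is HandPose.FLAT:
        view.lon += dx            # plain pan
        view.lat += dy
    elif gesture is Gesture.DRAG and pose is HandPose.FIST:
        view.layer = "satellite"  # same drag, different mode: switch layer
    elif gesture is Gesture.PINCH and pose is HandPose.FLAT:
        view.zoom *= scale        # plain zoom
    elif gesture is Gesture.PINCH and pose is HandPose.L_SHAPE:
        view.zoom *= scale        # zoom restricted to a magic-lens region
    return view


if __name__ == "__main__":
    v = MapView()
    v = apply(v, Gesture.DRAG, HandPose.FLAT, dx=1.5, dy=-0.5)
    v = apply(v, Gesture.PINCH, HandPose.FLAT, scale=2.0)
    print(v)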
Year
2012
DOI
10.1145/2406367.2406377
Venue
MUM
Keywords
additional input modality, geographical data, complex multi-touch gesture, multi-touch interaction, complex system, complex data, geospatial data, novel interaction approach, additional functionality, multi-modal interaction, interaction paradigm, geographic data, geographic information system, mobile devices
Field
Geospatial analysis, Complex system, Geographic information system, Computer science, Wearable computer, Gesture, Complex data type, Human–computer interaction, Mobile device, Multimedia, Modal
DocType
Conference
Citations
1
PageRank
0.36
References
15
Authors
4
Name | Order | Citations | PageRank
Florian Daiber | 1 | 151 | 25.22
Sven Gehring | 2 | 299 | 26.04
Markus Löchtefeld | 3 | 447 | 36.12
Antonio Krüger | 4 | 1537 | 127.04