Title
Multimodal reference resolution for mobile spatial interaction in urban environments
Abstract
We present the results of a study on referring to the outside environment from within a moving vehicle. Reference resolution is the first step required to integrate the outside environment into the car's interactive system: it is the problem of determining which of the objects outside the vehicle the user is interested in. In our study, we explored eye gaze, head pose, pointing gestures with a smartphone, and the user's field of view. We implemented and tested the system in a moving vehicle in real-life traffic. For safety reasons, the front-seat passenger used the system while the driver concentrated entirely on driving. To analyze and visualize the user's interaction with the environment, 528 buildings of the city were modeled in 2.5D using an airborne LIDAR scan, Google Earth, and a spatial database. Based on the results of the study, we propose a new algorithm for spatial reference resolution together with a scanning mechanism.
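The paper itself does not include code; as a rough illustration of the kind of computation spatial reference resolution against a 2.5D city model involves, the minimal sketch below casts a horizontal ray (for example, a bearing derived from a pointing gesture or gaze direction) from the vehicle position against building footprints and returns the nearest hit. The Building class, the local metric coordinates, and the example footprints are hypothetical and are not taken from the paper.

# Minimal sketch (not the authors' algorithm): resolve a pointing/gaze reference
# by intersecting a horizontal ray from the vehicle with 2.5D building footprints
# and returning the nearest hit. Coordinates are metres in a local ENU frame.
import math
from dataclasses import dataclass

@dataclass
class Building:
    name: str
    footprint: list[tuple[float, float]]  # polygon vertices of the ground footprint
    height: float                         # roof height above ground (the "2.5D" part)

def _ray_segment_distance(ox, oy, dx, dy, ax, ay, bx, by):
    """Distance along the ray (origin o, unit direction d) to wall segment a-b, or None."""
    ex, ey = bx - ax, by - ay
    denom = dx * ey - dy * ex
    if abs(denom) < 1e-9:                            # ray parallel to this wall
        return None
    t = ((ax - ox) * ey - (ay - oy) * ex) / denom    # distance along the ray
    u = ((ax - ox) * dy - (ay - oy) * dx) / denom    # position on the segment [0, 1]
    return t if t > 0.0 and 0.0 <= u <= 1.0 else None

def resolve_reference(origin, bearing_deg, buildings):
    """Return the building whose footprint the reference ray hits first, and the distance."""
    dx = math.sin(math.radians(bearing_deg))         # bearing measured clockwise from north
    dy = math.cos(math.radians(bearing_deg))
    best, best_dist = None, float("inf")
    for b in buildings:
        pts = b.footprint
        for i in range(len(pts)):
            ax, ay = pts[i]
            bx, by = pts[(i + 1) % len(pts)]
            d = _ray_segment_distance(origin[0], origin[1], dx, dy, ax, ay, bx, by)
            if d is not None and d < best_dist:
                best, best_dist = b, d
    return best, best_dist

if __name__ == "__main__":
    city = [
        Building("Rathaus", [(30, 40), (50, 40), (50, 60), (30, 60)], 22.0),
        Building("Bahnhof", [(-20, 80), (10, 80), (10, 110), (-20, 110)], 15.0),
    ]
    hit, dist = resolve_reference(origin=(0.0, 0.0), bearing_deg=40.0, buildings=city)
    print(hit.name if hit else None, round(dist, 1) if hit else None)

A full system along the lines described in the abstract would additionally fuse the individual modalities (gaze, head pose, pointing) and could use the stored building heights to reject hits that are inconsistent with the vertical component of the gaze or pointing direction; this sketch only covers the horizontal intersection step.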
Year
2012
DOI
10.1145/2390256.2390296
Venue
AutomotiveUI
Keywords
front-seat passenger, spatial database, new algorithm, reference resolution, multimodal reference resolution, urban environment, necessary step, spatial reference resolution, google earth, outside environment, interactive system, airborne lidar, mobile spatial interaction, eye gaze, automotive
Field
Gesture, Human–computer interaction, Eye tracking, Lidar, Artificial intelligence, Spatial database, Computer vision, Moving vehicle, Visualization, Simulation, Spatial interaction, Engineering, Automotive industry
DocType
Conference
Citations
2
PageRank
0.51
References
13
Authors
2
Name
Order
Citations
PageRank
Mohammad Mehdi Moniri1317.31
Christian Müller2484.50