Title
Multi-sensor fusion for interactive visual computing in mixed environment
Abstract
Mobile Augmented Reality, as an emerging application for handheld devices, explores more natural interaction in real and virtual environments. To achieve accurate system response and real-time object manipulation, extensive efforts have been made to estimate the six-degree-of-freedom pose and to extract robust features for tracking. However, many challenges remain in delivering a rich user experience. To allow a seamless transition from outdoor to indoor service, we investigated and integrated several sensing techniques: GPS, wireless, inertial measurement units (IMUs), and optical tracking. A parallel tracking and matching scheme is presented to address the speed-accuracy tradeoff. Two prototypes, fine-scale mirror world navigation and context-aware troubleshooting, have been developed to demonstrate the suitability of our approach.
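The abstract names a parallel tracking and matching scheme as the answer to the speed-accuracy tradeoff. The following is a minimal, hypothetical Python sketch of that general idea only (a fast frame-to-frame tracker running at camera rate, with a slower background matcher that feeds drift-free corrections back to it); it is not the authors' implementation, and every function, data value, and rate below is an illustrative placeholder.

```python
import queue
import threading
import time

# Corrections computed by the slow matcher are handed back to the tracker here.
correction_queue = queue.Queue()

def match_features(frame):
    """Placeholder for the slow but accurate matching step (e.g. descriptor
    matching against a stored map); simulates latency and returns a fake
    drift-free pose for the given frame index."""
    time.sleep(0.2)                     # matching is much slower than tracking
    return {"frame": frame, "pose": float(frame)}

def matching_worker(keyframes):
    # Background thread: match selected keyframes and post corrections.
    for frame in keyframes:
        correction_queue.put(match_features(frame))

def tracking_loop(num_frames):
    pose = 0.0                          # 1D placeholder pose; accumulates drift
    keyframes = range(0, num_frames, 10)
    threading.Thread(target=matching_worker, args=(keyframes,), daemon=True).start()

    for frame in range(num_frames):
        pose += 1.01                    # fast per-frame update with simulated drift
        try:
            # Non-blocking: apply a correction only if the matcher finished one.
            result = correction_queue.get_nowait()
            pose = result["pose"] + 1.0 * (frame - result["frame"])  # re-anchor
        except queue.Empty:
            pass
        time.sleep(0.01)                # keep the tracker near camera rate
    return pose

if __name__ == "__main__":
    print("final fused pose estimate:", tracking_loop(100))
```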
Year
2010
DOI
10.1145/1873951.1874264
Venue
ACM Multimedia 2010
Keywords
inertial measurement units, context-aware troubleshooting, mobile augmented reality, multi-sensor fusion, mixed environment, interactive visual computing, handheld device, fine-scale mirror world navigation, accurate system response, indoor service, matching scheme, tracking, real time, degree of freedom, interactive visualization, sensor fusion, virtual environment, user experience
Field
Computer vision, Visual computing, User experience design, Units of measurement, Wireless, Computer science, Sensor fusion, Augmented reality, Mobile device, Artificial intelligence, Global Positioning System, Multimedia
DocType
Conference
Citations
0
PageRank
0.34
References
2
Authors
8

Name                 Order  Citations  PageRank
Peng Patricia Wang   1      0          0.34
Tao Wang             2      238        23.70
Dayong Ding          3      52         7.11
Yimin Zhang          4      359        28.66
Kai Miao             5      37         4.44
Cynthia Pickering    6      128        11.26
Phil Tian            7      8          1.10
Jinxue Zhang         8      138        9.25