Title
Multi-modal sensor fusion algorithm for ubiquitous infrastructure-free localization in vision-impaired environments.
Abstract
In this paper, we present a unified approach to camera tracking based on an error-state Kalman filter. The filter fuses relative (local) measurements, obtained from image-based motion estimation through visual odometry, with global measurements produced by landmark matching against a pre-built visual landmark database and by range measurements from radio-frequency (RF) ranging radios. To evaluate the system, we use the camera poses it outputs to render views from a 3D graphical model built in the same coordinate frame as the landmark database, which also defines the global coordinate system, and compare the rendered views to the actual video images. These results demonstrate both the long-term stability and the overall accuracy of our algorithm, which is intended to solve the GPS-denied ubiquitous camera tracking problem under both vision-aided and vision-impaired conditions.
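The fusion scheme the abstract describes can be illustrated with a minimal sketch (not the authors' implementation): a Kalman-style filter that propagates a 2D position with a relative visual-odometry displacement, then corrects accumulated drift with an absolute range measurement to an RF node at a known position. The function names, the 2D state, and the beacon setup are all illustrative assumptions.

```python
import numpy as np

def predict(x, P, delta, Q):
    """Propagate a 2D position estimate by a visual-odometry displacement.

    Relative measurements only add uncertainty, so the covariance grows
    (this is the drift that global measurements must correct).
    """
    x = x + delta
    P = P + Q
    return x, P

def update_range(x, P, z, beacon, R):
    """Correct the estimate with a range measurement z to a known beacon."""
    diff = x - beacon
    dist = np.linalg.norm(diff)
    H = (diff / dist).reshape(1, 2)   # Jacobian of range w.r.t. position
    y = z - dist                      # innovation: measured minus predicted range
    S = H @ P @ H.T + R               # innovation covariance (1x1)
    K = P @ H.T / S                   # Kalman gain (2x1)
    x = x + (K * y).ravel()
    P = (np.eye(2) - K @ H) @ P
    return x, P

# Odometry drifts 0.2 m past the true position (1.0, 0.0); a range fix
# to a beacon at (5.0, 0.0) pulls the estimate back and shrinks P.
x, P = np.array([0.0, 0.0]), np.eye(2) * 0.01
x, P = predict(x, P, np.array([1.2, 0.0]), np.eye(2) * 0.05)
x, P = update_range(x, P, 4.0, np.array([5.0, 0.0]), np.array([[0.01]]))
```

The paper's actual filter is an error-state formulation over full 6-DOF camera pose and also fuses landmark-database matches; this sketch only conveys the relative-plus-global structure of the measurement model.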
Year
2010
DOI
10.1109/IROS.2010.5649562
Venue
IROS
Keywords
Kalman filters, distance measurement, motion estimation, radio tracking, radionavigation, rendering (computer graphics), sensor fusion, stereo image processing, 3D graphical model, camera tracking system, error-state Kalman filter, global coordinate system, multimodal sensor fusion, range measurement, ubiquitous camera tracking, vision-impaired environment, visual landmark database, visual odometry
Field
Computer vision, Visual odometry, Computer science, Algorithm, Sensor fusion, Kalman filter, Ranging, Global Positioning System, Artificial intelligence, Motion estimation, Landmark, Radio navigation
DocType
Conference
ISSN
2153-0858
Citations
3
PageRank
0.39
References
7
Authors
5
Name | Order | Citations | PageRank
Taragay Oskiper | 1 | 85 | 8.60
Han-Pang Chiu | 2 | 94 | 10.83
Zhiwei Zhu | 3 | 694 | 55.60
Supun Samarasekera | 4 | 792 | 85.72
Rakesh Kumar | 5 | 1923 | 157.44