Title
Disparity estimation based on fusion of vision and LiDAR
Abstract
Advanced driver assistance systems improve driving safety and comfort by using onboard sensors to collect environmental data, analyze it, and support decision making. These systems therefore place high demands on distance perception of the environment. Perception sensors commonly used in traditional solutions include stereo vision sensors and Light Detection and Ranging (LiDAR) sensors. This paper proposes a multi-sensor fusion method for disparity estimation that combines the high data density of stereo vision sensors with the measurement accuracy of LiDAR sensors. The method improves sensing accuracy while maintaining high-density output, making it suitable for distance sensing tasks in complex environments. Experimental results on real data demonstrate that the proposed disparity estimation method performs well and is robust across different scenarios.
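The abstract does not detail the fusion pipeline, but the general idea it describes can be sketched as follows: compute a dense (but noisier) disparity map from a rectified stereo pair, project calibrated LiDAR points into the left image as sparse but accurate disparities, and blend the two so the output keeps the stereo map's density with LiDAR-level accuracy at measured pixels. The sketch below is an assumption of one such pipeline, not the authors' implementation; the use of OpenCV's semi-global matcher and all parameter names (fx, fy, cx, cy, baseline, lidar_weight) are illustrative.

# Illustrative stereo-LiDAR disparity fusion (assumed pipeline, not the paper's method).
import numpy as np
import cv2

def stereo_disparity(left_gray, right_gray, num_disp=128, block_size=5):
    # Dense but comparatively noisy disparity from semi-global block matching.
    matcher = cv2.StereoSGBM_create(
        minDisparity=0,
        numDisparities=num_disp,
        blockSize=block_size,
        P1=8 * block_size ** 2,
        P2=32 * block_size ** 2,
    )
    # OpenCV returns fixed-point disparity scaled by 16.
    return matcher.compute(left_gray, right_gray).astype(np.float32) / 16.0

def lidar_to_sparse_disparity(points_cam, fx, fy, cx, cy, baseline, image_shape):
    # Project LiDAR points (already in the camera frame, in meters) into the left
    # image and convert depth Z to disparity d = fx * baseline / Z.
    sparse = np.zeros(image_shape, dtype=np.float32)
    x, y, z = points_cam[:, 0], points_cam[:, 1], points_cam[:, 2]
    front = z > 0.1
    u = np.round(fx * x[front] / z[front] + cx).astype(int)
    v = np.round(fy * y[front] / z[front] + cy).astype(int)
    inside = (u >= 0) & (u < image_shape[1]) & (v >= 0) & (v < image_shape[0])
    sparse[v[inside], u[inside]] = fx * baseline / z[front][inside]
    return sparse

def fuse_disparity(dense, sparse, lidar_weight=0.9):
    # Where accurate LiDAR disparities exist, blend them with the dense stereo
    # estimate; elsewhere keep the dense map so the output stays high density.
    fused = dense.copy()
    mask = sparse > 0
    fused[mask] = lidar_weight * sparse[mask] + (1.0 - lidar_weight) * dense[mask]
    return fused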
Year
2022
DOI
10.1142/S021969132250014X
Venue
INTERNATIONAL JOURNAL OF WAVELETS MULTIRESOLUTION AND INFORMATION PROCESSING
Keywords
Stereo vision, LiDAR, disparity map
DocType
Journal
Volume
20
Issue
05
ISSN
0219-6913
Citations
0
PageRank
0.34
References
0
Authors
5
Name            Order    Citations    PageRank
Chao Ma         1        85           27.49
Pei Shanshan    2        0            0.34
Sun Guoliang    3        0            0.34
Meng Ran        4        0            0.34
Luo Kun         5        0            0.34