Title
A Camera–Radar Fusion Method Based on Edge Computing
Abstract
Multi-access edge computing provides a low-latency, high-performance network environment for Internet of Vehicles services by migrating computing and storage resources to the network edge, and it supports the deployment of more region-specific vehicle-to-everything services. In this paper, we propose a fusion perception method for roadside cameras and millimeter-wave radar as an application of edge computing. First, we used YOLOv3 to process camera data and the DBSCAN clustering method to process radar data, obtaining the position, speed, and category of detections. Next, we used joint calibration to spatially synchronize the camera and radar data, and a direct update method to synchronize them temporally. Furthermore, we used the Munkres algorithm to associate camera and radar detections and a Kalman filter to track the fusion method's perception results. Finally, we conducted simulation experiments to evaluate the proposed method; the results demonstrate the effectiveness of our algorithm.
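As an illustration of the association step mentioned in the abstract, the following is a minimal sketch (not the authors' implementation) of minimum-cost one-to-one matching between camera and radar detections. For the handful of detections in a single frame, brute-force enumeration over permutations finds the same optimal assignment that the Munkres (Hungarian) algorithm computes in polynomial time; the detection coordinates and the distance gate are hypothetical.

```python
import itertools
import math

def assignment_cost(camera_dets, radar_dets):
    # Cost matrix of Euclidean distances between camera detections
    # (projected into the radar's ground-plane frame) and radar detections.
    return [[math.dist(c, r) for r in radar_dets] for c in camera_dets]

def associate(camera_dets, radar_dets, gate=5.0):
    """Minimum-cost one-to-one association of detections.

    Brute force over permutations: adequate for a few detections per
    frame; a Munkres/Hungarian solver should replace this for larger
    problems. Returns (camera_index, radar_index) pairs.
    """
    cost = assignment_cost(camera_dets, radar_dets)
    n, m = len(camera_dets), len(radar_dets)
    best_total, best_pairs = float("inf"), []
    for perm in itertools.permutations(range(m), min(n, m)):
        pairs = list(enumerate(perm))
        total = sum(cost[i][j] for i, j in pairs)
        if total < best_total:
            best_total, best_pairs = total, pairs
    # Gate out pairs whose distance is too large to be the same object.
    return [(i, j) for i, j in best_pairs if cost[i][j] <= gate]

# Hypothetical detections in a common ground-plane frame (metres).
cam = [(10.0, 2.0), (25.0, -1.0)]
rad = [(24.6, -0.8), (10.3, 2.1)]
print(associate(cam, rad))  # → [(0, 1), (1, 0)]
```

The gating step is a common practical safeguard: even the optimal assignment can pair a camera detection with a distant radar return when one sensor misses an object, so matches beyond a distance threshold are discarded rather than fused.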
Year
2020
DOI
10.1109/EDGE50951.2020.00009
Venue
2020 IEEE International Conference on Edge Computing (EDGE)
Keywords
edge computing, sensor fusion, camera, radar, roadside perception
DocType
Conference
ISBN
978-1-7281-8255-1
Citations
0
PageRank
0.34
References
0
Authors
7
Name           Order  Citations  PageRank
Yanjin Fu      1      0          0.34
Daxin Tian     2      204        32.49
Xunting Duan   3      0          0.34
Jianshan Zhou  4      0          0.34
Ping Lang      5      19         1.28
Chunmian Lin   6      0          0.34
Xin You        7      0          0.34