Abstract |
---|
Like today's autonomous vehicle prototypes, vehicles in the future will have rich sensors to map and identify objects in the environment. For example, many autonomous vehicle prototypes today come with line-of-sight depth perception sensors like 3D cameras. These cameras are used for improving vehicular safety in autonomous driving, but have fundamentally limited visibility due to occlusions, sensing range, and extreme weather and lighting conditions. To improve visibility and performance, not just for autonomous vehicles but for other Advanced Driving Assistance Systems (ADAS), we explore a capability called Augmented Vehicular Reality (AVR). AVR broadens the vehicle's visual horizon by enabling it to share visual information with other nearby vehicles, but requires careful techniques to align coordinate frames of reference, and to detect dynamic objects. Preliminary evaluations hint at the feasibility of AVR and also highlight research challenges in achieving AVR's potential to improve autonomous vehicles and ADAS. |
Year | DOI | Venue |
---|---|---|
2017 | 10.1145/3032970.3032976 | HotMobile |
Keywords | Field | DocType |
---|---|---|
Autonomous Cars, ADAS, Collaborative Sensing, Extended Vision | Computer vision, Visibility, Computer science, Extreme weather, Real-time computing, Artificial intelligence, Depth perception, Frame of reference | Conference |
Citations | PageRank | References |
---|---|---|
11 | 0.62 | 19 |
Authors |
---|
6 |
Name | Order | Citations | PageRank |
---|---|---|---|
Hang Qiu | 1 | 50 | 5.16 |
Fawad Ahmad | 2 | 32 | 4.42
Ramesh Govindan | 3 | 15430 | 2144.86
Marco Gruteser | 4 | 4631 | 309.81 |
Fan Bai | 5 | 2017 | 135.11 |
Gorkem Kar | 6 | 63 | 3.82 |