Title
Semantic-aware Active Perception for UAVs using Deep Reinforcement Learning
Abstract
This work presents a semantic-aware path-planning pipeline for Unmanned Aerial Vehicles (UAVs) using deep reinforcement learning for vision-based navigation in challenging environments. Driven by the maturity of work in semantic segmentation, the proposed path-planning architecture uses reinforcement learning to distinguish the parts of the scene that are perceptually more informative using semantic cues, in effect guiding more robust, repeatable, and accurate navigation of the UAV to the predefined goal destination. Assuming that the UAV performs vision-based state estimation, such as keyframe-based visual odometry, and semantic segmentation onboard, the proposed deep policy network continuously evaluates the relative perceptual informativeness of each semantic class in view. A perception-aware path planner uses these informativeness values to perform trajectory optimization, generating the next best action with respect to the current state and the perception quality of the surroundings, and essentially guiding the UAV to avoid flying over perceptually degraded regions. Thanks to the use of semantic cues, the policy can be trained on a large number of non-photorealistic, randomly generated scenes, resulting in an architecture that generalizes to environments with the same semantic classes, independently of their visual appearance. Extensive evaluations in challenging, photorealistic simulations reveal a remarkable improvement in robustness and success rate of the proposed approach over the state of the art in active perception. Video: https://youtu.be/RaO3whUBVnc
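As a concrete illustration of the two components the abstract describes, the sketch below pairs a small policy network that maps the UAV's current observation to per-class informativeness weights with a simple perception cost that scores candidate trajectories by the informativeness of the regions they overfly. This is a minimal, hypothetical PyTorch sketch: the network size, the observation encoding (a semantic-class histogram plus a few odometry/goal features), and the cost form are assumptions made for illustration, not the authors' implementation.

```python
# Hypothetical sketch of (1) a policy network producing per-semantic-class
# "informativeness" weights and (2) a perception cost for ranking candidate
# trajectories. All names, dimensions, and the cost form are illustrative
# assumptions, not taken from the paper.

import torch
import torch.nn as nn

NUM_CLASSES = 8               # assumed number of semantic classes in view
STATE_DIM = NUM_CLASSES + 4   # e.g. class histogram + coarse odometry/goal features


class InformativenessPolicy(nn.Module):
    """Maps the UAV's current observation to a weight per semantic class."""

    def __init__(self, state_dim: int = STATE_DIM, num_classes: int = NUM_CLASSES):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(state_dim, 64),
            nn.ReLU(),
            nn.Linear(64, 64),
            nn.ReLU(),
            nn.Linear(64, num_classes),
        )

    def forward(self, state: torch.Tensor) -> torch.Tensor:
        # Softmax yields relative informativeness weights that sum to 1.
        return torch.softmax(self.net(state), dim=-1)


def trajectory_perception_cost(class_fractions: torch.Tensor,
                               weights: torch.Tensor) -> torch.Tensor:
    """Perception term for a candidate trajectory.

    class_fractions: (num_waypoints, NUM_CLASSES) fraction of each semantic
        class visible around each waypoint (illustrative quantity).
    weights: (NUM_CLASSES,) informativeness weights from the policy.
    Lower cost means the trajectory overflies more informative regions.
    """
    informativeness_per_waypoint = class_fractions @ weights  # (num_waypoints,)
    return -informativeness_per_waypoint.mean()


if __name__ == "__main__":
    policy = InformativenessPolicy()
    state = torch.rand(1, STATE_DIM)      # placeholder observation
    w = policy(state).squeeze(0)          # per-class informativeness weights

    # Score two hypothetical candidate trajectories and pick the cheaper one.
    candidates = [torch.rand(10, NUM_CLASSES), torch.rand(10, NUM_CLASSES)]
    costs = [trajectory_perception_cost(c, w) for c in candidates]
    best = min(range(len(candidates)), key=lambda i: costs[i].item())
    print(f"costs = {[c.item() for c in costs]}, best candidate = {best}")
```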
Year: 2021
DOI: 10.1109/IROS51168.2021.9635893
Venue: 2021 IEEE/RSJ INTERNATIONAL CONFERENCE ON INTELLIGENT ROBOTS AND SYSTEMS (IROS)
DocType: Conference
ISSN: 2153-0858
Citations: 0
PageRank: 0.34
References: 0
Authors: 3
Name             Order  Citations  PageRank
Luca Bartolomei  1      0          0.34
Lucas Teixeira   2      30         6.93
Margarita Chli   3      1283       53.59