Title
Learning To Predict Ego-Vehicle Poses For Sampling-Based Nonholonomic Motion Planning
Abstract
Sampling-based motion planning is an effective tool to compute safe trajectories for automated vehicles in complex environments. However, fast convergence to the optimal solution can only be ensured with problem-specific sampling distributions. Due to the large variety of driving situations in automated driving, it is very challenging to design such distributions manually. This letter therefore introduces a data-driven approach utilizing a deep convolutional neural network (CNN): given the current driving situation, future ego-vehicle poses are generated directly from the output of the CNN, which guides the motion planner efficiently toward the optimal solution. A benchmark highlights that the CNN predicts future vehicle poses with higher accuracy than uniform sampling and a state-of-the-art A*-based approach. Combining this CNN-guided sampling with the bidirectional RRT* motion planner reduces the computation time by up to an order of magnitude and yields faster convergence to a lower cost as well as a 100% success rate in the tested scenarios.
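The core idea described in the abstract, predicted ego-vehicle poses used as a biased sampling distribution for a planner such as bidirectional RRT*, can be sketched as follows. This is a minimal illustration under stated assumptions, not the authors' implementation: the wrapper predict_pose_distribution, the grid parameters GRID_RES and N_THETA, and the blending weight p_cnn are all hypothetical names, and blending CNN samples with uniform samples is one common way to realize guided sampling while retaining probabilistic completeness.

# Minimal sketch of CNN-guided pose sampling for a sampling-based planner
# (e.g., bidirectional RRT*). All names and parameters here are illustrative
# assumptions, not the authors' original code.

import numpy as np

GRID_RES = 0.5   # [m] cell size of the discretized workspace (assumed)
N_THETA = 36     # number of discretized heading bins (assumed)


def predict_pose_distribution(cnn, situation):
    """Hypothetical wrapper: run the CNN on the encoded driving situation and
    return a normalized probability volume over (y, x, theta) grid cells."""
    logits = cnn(situation)                  # assumed output shape: (H, W, N_THETA)
    probs = np.exp(logits - logits.max())    # softmax over all cells
    return probs / probs.sum()


def sample_pose(probs, bounds, p_cnn=0.9, rng=None):
    """Draw one ego-vehicle pose (x, y, theta): with probability p_cnn from the
    CNN distribution, otherwise uniformly to retain probabilistic completeness."""
    rng = np.random.default_rng() if rng is None else rng
    (x_min, x_max), (y_min, y_max) = bounds
    if rng.random() < p_cnn:
        # Pick a grid cell according to the predicted probabilities, then add
        # sub-cell noise so samples are not restricted to cell centers.
        flat_idx = rng.choice(probs.size, p=probs.ravel())
        iy, ix, ith = np.unravel_index(flat_idx, probs.shape)
        x = x_min + (ix + rng.random()) * GRID_RES
        y = y_min + (iy + rng.random()) * GRID_RES
        theta = (ith + rng.random()) * 2.0 * np.pi / N_THETA
    else:
        # Fallback: uniform sampling over the workspace bounds.
        x = rng.uniform(x_min, x_max)
        y = rng.uniform(y_min, y_max)
        theta = rng.uniform(0.0, 2.0 * np.pi)
    return np.array([x, y, theta])

In a planner loop, sample_pose would replace the uniform sampler that generates candidate states for tree expansion; keeping a small uniform fraction (1 - p_cnn) is a standard safeguard against an overconfident learned distribution.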
Year
2019
DOI
10.1109/LRA.2019.2893975
Venue
IEEE ROBOTICS AND AUTOMATION LETTERS
Keywords
Deep learning in robotics and automation, non-holonomic motion planning, motion and path planning
DocType
Journal
Volume
4
Issue
2
ISSN
2377-3766
Citations
0
PageRank
0.34
References
0
Authors
4
Name                    Order  Citations  PageRank
Holger Banzhaf          1      1          1.42
Paul Sanzenbacher       2      0          0.34
Ulrich Baumann          3      0          0.34
Johann Marius Zöllner   4      131        24.29