Title
Deep visual perception for dynamic walking on discrete terrain
Abstract
Dynamic bipedal walking on discrete terrain, such as stepping stones, is a challenging problem that requires feedback controllers to enforce safety-critical constraints. Enforcing such constraints in real-world experiments demands fast and accurate perception for foothold detection and estimation. In this work, a deep visual perception model is designed to accurately estimate the step length to the next foothold, which serves as input to the feedback controller and enables vision-in-the-loop dynamic walking on discrete terrain. In particular, a custom convolutional neural network architecture is designed and trained to predict the step length to the next foothold from a sampled image preview of the upcoming terrain captured at foot impact. The visual input is provided only at the beginning of each step and is shown to be sufficient for dynamically stepping onto discrete footholds. Through extensive numerical studies, we show that the robot is able to walk autonomously for over 100 steps without failure on discrete terrain with footholds randomly positioned within a step length range of [45, 85] centimeters.
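The pipeline the abstract describes, a CNN that maps a single terrain preview image at foot impact to a scalar step-length estimate bounded by the controller's feasible range, can be illustrated with a minimal NumPy sketch. The architecture, layer sizes, weights, and the `predict_step_length` helper below are illustrative placeholders, not the paper's actual network; only the bounded [0.45, 0.85] m output range is taken from the abstract.

```python
import numpy as np

def conv2d(img, kernel):
    """Valid 2-D cross-correlation of a single-channel image with one kernel."""
    kh, kw = kernel.shape
    H, W = img.shape
    out = np.empty((H - kh + 1, W - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(img[i:i + kh, j:j + kw] * kernel)
    return out

def predict_step_length(img, kernels, w, b):
    """Toy conv -> ReLU -> global-average-pool -> linear regressor.

    A sigmoid squashes the linear output into the [0.45, 0.85] m
    step-length range quoted in the abstract, so the estimate handed
    to the feedback controller is always physically feasible.
    """
    feats = np.array([np.maximum(conv2d(img, k), 0.0).mean() for k in kernels])
    raw = feats @ w + b
    return 0.45 + 0.40 / (1.0 + np.exp(-raw))

# Stand-in for the terrain preview sampled at foot impact (weights untrained).
rng = np.random.default_rng(0)
img = rng.random((32, 32))
kernels = rng.standard_normal((4, 5, 5)) * 0.1
w = rng.standard_normal(4)
step = predict_step_length(img, kernels, w, b=0.0)
```

In the paper's setting this prediction is made once per step, at impact, and then held fixed while the controller executes the step, which is why a single-image regressor rather than a continuous visual servoing loop suffices.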
Year: 2017
DOI: 10.1109/HUMANOIDS.2017.8246907
Venue: 2017 IEEE-RAS 17th International Conference on Humanoid Robotics (Humanoids)
Keywords: custom convolutional neural network architecture, visual input, discrete footholds, discrete terrain, step length range, dynamic bipedal walking, feedback controller, safety-critical constraints, foothold detection, deep visual perception model, vision-in-the-loop dynamic walking, sampled image preview
DocType: Journal
Volume: abs/1712.00916
ISBN: 978-1-5386-4679-3
Citations: 2
PageRank: 0.40
References: 5
Authors: 4
Name             | Order | Citations | PageRank
Avinash Siravuru |   1   |     6     |   1.80
Allan Wang       |   2   |     4     |   0.76
Quan T. Nguyen   |   3   |    27     |   5.43
Koushil Sreenath |   4   |   358     |  33.41