Abstract
---
One-shot imitation is the vision of robot programming from a single demonstration, rather than by tedious construction of computer code. We present a practical method for realizing one-shot imitation for manipulation tasks, exploiting modern learning-based optical flow to perform real-time visual servoing. Our approach, which we call FlowControl, continuously tracks a demonstration video, using a specified foreground mask to attend to an object of interest. Using RGB-D observations, FlowControl requires no 3D object models, and is easy to set up. FlowControl inherits great robustness to visual appearance from decades of work in optical flow. We exhibit FlowControl on a range of problems, including ones requiring very precise motions, and ones requiring the ability to generalize.
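The abstract describes servoing on optical flow between a live RGB-D view and a demonstration frame, restricted to a foreground mask. The sketch below is a minimal illustration of that idea, not the authors' implementation: the flow field, camera intrinsics (`fx`, `fy`), and the `servo_step` interface are all assumptions, and the 2D flow plus depth difference are turned into a Cartesian correction via a simple pinhole back-projection.

```python
import numpy as np

def servo_step(flow, depth_live, depth_demo, mask, fx=600.0, fy=600.0, gain=0.5):
    """One flow-based visual-servoing step (illustrative sketch only).

    flow       : (H, W, 2) optical-flow field from the live frame toward
                 the demonstration frame (assumed given, e.g. by a
                 learning-based flow network).
    depth_*    : (H, W) metric depth maps for live and demo frames.
    mask       : (H, W) foreground mask selecting the object of interest.
    Returns a gain-scaled Cartesian correction [dx, dy, dz] in meters.
    """
    m = mask.astype(bool)
    # Mean 2D flow over the foreground gives the lateral image-space error.
    du, dv = flow[m].mean(axis=0)
    z = float(np.median(depth_live[m]))  # metric depth of the tracked object
    # Back-project the pixel error to metric XY with the pinhole model.
    dx, dy = du * z / fx, dv * z / fy
    # Depth mismatch between demo and live frames gives the Z error.
    dz = float(np.median(depth_demo[m]) - np.median(depth_live[m]))
    return gain * np.array([dx, dy, dz])
```

Iterating this step until the masked flow and depth residuals vanish drives the live view toward the demonstration frame, which is the closed-loop behavior the abstract refers to.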
Year | DOI | Venue
---|---|---
2020 | 10.1109/IROS45743.2020.9340942 | IROS

DocType | Citations | PageRank
---|---|---
Conference | 0 | 0.34

References | Authors
---|---
0 | 4
Name | Order | Citations | PageRank |
---|---|---|---|
Max Argus | 1 | 12 | 3.16 |
Lukáš Hermann | 2 | 0 | 1.35
Jon Long | 3 | 0 | 0.34 |
Thomas Brox | 4 | 7866 | 327.52 |