Abstract
---
Task-relevant grasping is critical for industrial assembly, where downstream manipulation tasks constrain the set of valid grasps. Learning to perform this task, however, is challenging, since task-relevant grasp labels are hard to define and annotate. There is also no consensus yet on suitable representations for modeling task-relevant grasps, nor are there off-the-shelf tools for performing them. This work proposes a framework for learning task-relevant grasping of industrial objects without the need for time-consuming real-world data collection or manual annotation. To achieve this, the entire framework is trained solely in simulation, combining supervised training with synthetic label generation and self-supervised hand-object interaction. Within this framework, the paper proposes a novel, object-centric canonical representation at the category level, which allows establishing dense correspondence across object instances and transferring task-relevant grasps to novel instances. Extensive experiments on task-relevant grasping of densely cluttered industrial objects are conducted in both simulation and real-world setups, demonstrating the effectiveness of the proposed framework. Code and data are available at https://sites.google.com/view/catgrasp.
| Year | DOI | Venue |
|---|---|---|
| 2022 | 10.1109/ICRA46639.2022.9811568 | IEEE International Conference on Robotics and Automation |

| DocType | Volume | Issue |
|---|---|---|
| Conference | 2022 | 1 |

| Citations | PageRank | References |
|---|---|---|
| 0 | 0.34 | 0 |
Authors (4)

| Name | Order | Citations | PageRank |
|---|---|---|---|
| Wen Bowen | 1 | 2 | 2.05 |
| Wenzhao Lian | 2 | 0 | 0.34 |
| Kostas E. Bekris | 3 | 938 | 99.49 |
| Stefan Schaal | 4 | 6081 | 530.10 |