Abstract |
---|
To achieve a successful grasp, gripper attributes such as geometry and kinematics play a role as important as the object geometry. The majority of previous work has focused on developing grasp methods that generalize over novel object geometry but are specific to a certain robot hand. We propose UniGrasp, an efficient data-driven grasp synthesis method that considers both the object geometry and gripper attributes as inputs. UniGrasp is based on a novel deep neural network architecture that selects sets of contact points from the input point cloud of the object. The proposed model is trained on a large dataset to produce contact points that are in force closure and reachable by the robot hand. By using contact points as output, we can transfer between a diverse set of multifingered robotic hands. Our model produces over 90% valid contact points among the top-10 predictions in simulation and more than 90% successful grasps in real-world experiments for various known two-fingered and three-fingered grippers. Our model also achieves 93%, 83%, and 90% successful grasps in real-world experiments for an unseen two-fingered gripper and two unseen multifingered anthropomorphic robotic hands. |
Year | DOI | Venue |
---|---|---|
2020 | 10.1109/LRA.2020.2969946 | IEEE Robotics and Automation Letters
Keywords | DocType | Volume
---|---|---|
Deep learning in robotics and automation, grasping, multifingered hands | Journal | 5

Issue | ISSN | Citations
---|---|---|
2 | 2377-3766 | 2

PageRank | References | Authors
---|---|---|
0.37 | 0 | 9

Name | Order | Citations | PageRank |
---|---|---|---|
Lin Shao | 1 | 12 | 3.33 |
Fábio Ferreira | 2 | 2 | 1.38 |
Mikael Jorda | 3 | 5 | 2.53 |
Varun Nambiar | 4 | 2 | 0.37 |
Jianlan Luo | 5 | 2 | 2.06 |
Eugen Solowjow | 6 | 11 | 4.69 |
Juan Aparicio Ojea | 7 | 3 | 2.15 |
Oussama Khatib | 8 | 6376 | 1172.08 |
Jeannette Bohg | 9 | 275 | 30.60 |