Title: UniGrasp: Learning a Unified Model to Grasp With Multifingered Robotic Hands
Abstract: To achieve a successful grasp, gripper attributes such as geometry and kinematics play a role as important as the object geometry. The majority of previous work has focused on grasp methods that generalize over novel object geometry but are specific to a single robot hand. We propose UniGrasp, an efficient data-driven grasp synthesis method that takes both the object geometry and the gripper attributes as inputs. UniGrasp is based on a novel deep neural network architecture that selects sets of contact points from the input point cloud of the object. The model is trained on a large dataset to produce contact points that are in force closure and reachable by the robot hand. Because the output is expressed as contact points, the model transfers across a diverse set of multifingered robotic hands. It produces over 90% valid contact points among its top 10 predictions in simulation and more than 90% successful grasps in real-world experiments with various known two-fingered and three-fingered grippers. It also achieves 93%, 83%, and 90% successful grasps in real-world experiments with an unseen two-fingered gripper and two unseen multifingered anthropomorphic robotic hands.
Year: 2020
DOI: 10.1109/LRA.2020.2969946
Venue: IEEE Robotics and Automation Letters
Keywords: Deep learning in robotics and automation, grasping, multifingered hands
DocType: Journal
Volume: 5
Issue: 2
ISSN: 2377-3766
Citations: 2
PageRank: 0.37
References: 0
Authors: 9
Authors
1. Lin Shao
2. Fábio Ferreira
3. Mikael Jorda
4. Varun Nambiar
5. Jianlan Luo
6. Eugen Solowjow
7. Juan Aparicio Ojea
8. Oussama Khatib
9. Jeannette Bohg