Title
The Feeling of Success: Does Touch Sensing Help Predict Grasp Outcomes?
Abstract
A successful grasp requires careful balancing of the contact forces. Deducing whether a particular grasp will be successful from indirect measurements, such as vision, is therefore quite challenging, and direct sensing of contacts through touch provides an appealing avenue toward more successful and consistent robotic grasping. However, in order to fully evaluate the value of touch sensing for grasp outcome prediction, we must understand how touch sensing can influence outcome prediction accuracy when combined with other modalities. Doing so using conventional model-based techniques is exceptionally difficult. In this work, we investigate the question of whether touch sensing aids in predicting grasp outcomes within a multimodal sensing framework that combines vision and touch. To that end, we collected more than 9,000 grasping trials using a two-finger gripper equipped with GelSight high-resolution tactile sensors on each finger, and evaluated visuo-tactile deep neural network models that directly predict grasp outcomes from either modality individually, and from both modalities together. Our experimental results indicate that incorporating tactile readings substantially improves grasping performance.
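The multimodal setup described in the abstract (a deep network that predicts grasp success from a camera image, tactile readings, or both) can be illustrated with a minimal sketch. The code below is not the authors' released model; the two-branch late-fusion design, layer sizes, input resolutions, and the use of a single tactile image (rather than one per finger) are illustrative assumptions.

```python
# Minimal visuo-tactile grasp-outcome classifier sketch in PyTorch.
# One small CNN encoder per modality; features are concatenated and
# fed to a binary "grasp success" head. All hyperparameters are assumptions.
import torch
import torch.nn as nn


def conv_branch(in_channels: int) -> nn.Sequential:
    """Small convolutional encoder, identical in structure for each modality."""
    return nn.Sequential(
        nn.Conv2d(in_channels, 32, kernel_size=5, stride=2, padding=2),
        nn.ReLU(),
        nn.Conv2d(32, 64, kernel_size=3, stride=2, padding=1),
        nn.ReLU(),
        nn.AdaptiveAvgPool2d((4, 4)),
        nn.Flatten(),  # -> 64 * 4 * 4 = 1024 features per modality
    )


class VisuoTactileNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.vision = conv_branch(in_channels=3)  # RGB camera image
        self.touch = conv_branch(in_channels=3)   # GelSight tactile image (RGB-encoded)
        self.head = nn.Sequential(
            nn.Linear(2 * 64 * 4 * 4, 256),
            nn.ReLU(),
            nn.Linear(256, 1),                    # logit for grasp success
        )

    def forward(self, rgb: torch.Tensor, tactile: torch.Tensor) -> torch.Tensor:
        fused = torch.cat([self.vision(rgb), self.touch(tactile)], dim=1)
        return self.head(fused)


if __name__ == "__main__":
    model = VisuoTactileNet()
    rgb = torch.randn(8, 3, 128, 128)      # batch of camera images
    tactile = torch.randn(8, 3, 128, 128)  # batch of tactile images
    logits = model(rgb, tactile)
    # Binary cross-entropy against success labels (1 = successful grasp).
    loss = nn.BCEWithLogitsLoss()(logits.squeeze(1), torch.ones(8))
    print(logits.shape, loss.item())
```

Single-modality baselines of the kind compared in the paper can be approximated by feeding the head only one branch's features; the fusion-by-concatenation choice here is simply the most common baseline for such comparisons.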
Year         2017
Venue        CoRL
DocType      Journal
Volume       abs/1710.05512
Citations    8
PageRank     0.51
References   12
Authors      7
Name                 Order    Citations    PageRank
Roberto Calandra     1        105          13.42
Andrew Owens         2        74           5.13
Manu Upadhyaya       3        8            0.51
Wenzhen Yuan         4        85           5.90
Justin Lin           5        8            0.85
Edward H. Adelson    6        17683        20.52
Sergey Levine        7        3377         182.21