Title: Object pose estimation and tracking by fusing visual and tactile information
Abstract: Robot grasping and manipulation require accurate knowledge of the object's location within the robotic hand. By itself, a vision system cannot provide sufficiently precise and robust pose tracking due to occlusions and hardware limitations. This paper presents a method to estimate a grasped object's 6D pose by fusing sensor data from vision, tactile sensors and joint encoders. Given an initial pose acquired by the vision system and the contact locations on the fingertips, an iterative process refines the object pose estimate by finding a transformation that fits the grasped object to the fingertips. Experiments were carried out both in simulation and on a real system consisting of a Shadow arm and hand with ATI force/torque sensors mounted on the fingertips and a Microsoft Kinect camera. To make the method suitable for real-time applications, the algorithm's performance was evaluated in terms of speed and accuracy of convergence.
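The abstract's core operation, finding a transformation that fits the grasped object to the fingertip contacts, is not detailed in this record. As a standalone illustration (not the authors' implementation), the rigid-transform fit between contact points predicted from the vision-based pose and the contact locations measured at the fingertips can be sketched with the closed-form Kabsch/SVD alignment; all point data below are made up:

```python
import numpy as np

def fit_rigid_transform(src, dst):
    """Least-squares rigid transform (R, t) mapping src points onto dst
    points, via the Kabsch/SVD method."""
    c_src, c_dst = src.mean(axis=0), dst.mean(axis=0)
    # Cross-covariance of the centred point sets.
    H = (src - c_src).T @ (dst - c_dst)
    U, _, Vt = np.linalg.svd(H)
    # Sign correction so R is a proper rotation, not a reflection.
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = c_dst - R @ c_src
    return R, t

# Hypothetical contact points on the object model under the initial
# vision-based pose estimate (metres).
src = np.array([[ 0.030, 0.000,  0.020],
                [-0.020, 0.025,  0.010],
                [ 0.000, -0.030, 0.015],
                [ 0.005, 0.010, -0.020]])

# Simulated "measured" fingertip contacts: the same points under a small
# ground-truth pose error (5 degrees about z, plus a translation).
theta = np.deg2rad(5.0)
R_true = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                   [np.sin(theta),  np.cos(theta), 0.0],
                   [0.0,            0.0,           1.0]])
t_true = np.array([0.010, -0.005, 0.002])
dst = src @ R_true.T + t_true

R, t = fit_rigid_transform(src, dst)   # recovers R_true, t_true
```

With noisy contacts and unknown correspondences the fit would be applied iteratively (re-associating contacts to the object surface each step, as in ICP-style schemes), which matches the iterative optimisation the abstract describes at a high level.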
Year: 2012
DOI: 10.1109/MFI.2012.6343019
Venue: MFI
Field: Computer vision, Machine vision, Computer science, 3D pose estimation, Robot kinematics, Sensor fusion, Pose, Video tracking, Artificial intelligence, Articulated body pose estimation, Tactile sensor
DocType: Conference
ISBN: 978-1-4673-2511-0
Citations: 6
PageRank: 0.56
References: 7
Authors: 8

Name                      Order  Citations  PageRank
Joao Bimbo                1      6          0.56
Silvia Rodríguez-Jiménez  2      6          0.56
Hongbin Liu               3      42         3.99
Xiaojing Song             4      274        18.40
Nicolas Burrus            5      24         3.11
Lakmal D. Seneviratne     6      577        70.91
Mohamed Abderrahim        7      135        12.27
Kaspar Althoefer          8      847        112.87