Title
Dual arm estimation for coordinated bimanual manipulation
Abstract
This paper develops an estimation framework for sensor-guided dual-arm manipulation of a rigid object. Using an unscented Kalman filter (UKF), the approach fuses visual and kinesthetic information to track both the manipulators and the object. From visual updates of the object and manipulators, together with tactile updates, the method estimates both the robot's internal state and the object's pose. Nonlinear constraints are incorporated into the framework to handle the additional arm and keep the state estimate consistent. Two frameworks are compared: the first runs two single-arm filters in parallel, while the second consists of an augmented dual-arm filter with nonlinear constraints. Experiments on a wheel-changing task are demonstrated using the DARPA ARM-S system, which consists of dual Barrett WAM manipulators.
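The abstract describes an unscented Kalman filter that fuses visual and tactile measurements and enforces nonlinear constraints coupling the two arms and the object. The sketch below illustrates that generic mechanism only: a standard UKF measurement update followed by a covariance-weighted projection of the state onto a linearized equality constraint. The state layout, models, and function names are assumptions made for illustration and are not taken from the paper.

```python
# Minimal illustrative sketch (not the authors' implementation): a UKF
# measurement update followed by projection of the state onto a nonlinear
# equality constraint g(x) = 0, e.g. a rigid-grasp constraint coupling the
# two arm poses and the object pose. All dimensions, models, and names here
# are assumptions.
import numpy as np

def sigma_points(x, P, alpha=1e-3, kappa=0.0, beta=2.0):
    """Scaled sigma points and weights for the unscented transform."""
    n = x.size
    lam = alpha**2 * (n + kappa) - n
    L = np.linalg.cholesky((n + lam) * P)
    pts = np.vstack([x, x + L.T, x - L.T])            # (2n+1, n) sigma points
    wm = np.full(2 * n + 1, 1.0 / (2 * (n + lam)))
    wc = wm.copy()
    wm[0] = lam / (n + lam)
    wc[0] = lam / (n + lam) + (1.0 - alpha**2 + beta)
    return pts, wm, wc

def ukf_update(x, P, z, h, R):
    """Unscented measurement update with a nonlinear measurement model h
    (e.g. a camera observation of the object or manipulator pose)."""
    pts, wm, wc = sigma_points(x, P)
    Z = np.array([h(p) for p in pts])                 # propagate sigma points
    z_hat = wm @ Z
    dZ, dX = Z - z_hat, pts - x
    S = dZ.T @ (wc[:, None] * dZ) + R                 # innovation covariance
    C = dX.T @ (wc[:, None] * dZ)                     # state/measurement cross-covariance
    K = C @ np.linalg.inv(S)                          # Kalman gain
    return x + K @ (z - z_hat), P - K @ S @ K.T

def constrain(x, P, g, eps=1e-6):
    """Covariance-weighted projection of x onto the linearized constraint
    surface g(x) = 0, keeping the augmented dual-arm state consistent."""
    m, n = np.atleast_1d(g(x)).size, x.size
    G = np.zeros((m, n))                              # numerical constraint Jacobian
    for i in range(n):
        d = np.zeros(n); d[i] = eps
        G[:, i] = (np.atleast_1d(g(x + d)) - np.atleast_1d(g(x - d))) / (2 * eps)
    S = G @ P @ G.T
    return x - P @ G.T @ np.linalg.solve(S, np.atleast_1d(g(x)))
```

In the second framework described in the abstract, a constraint step of this kind is what ties the two single-arm estimates into one consistent augmented dual-arm state; the sketch above shows only the generic update-then-project pattern.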
Year
2013
DOI
10.1109/ICRA.2013.6630565
Venue
Robotics and Automation
Keywords
Kalman filters, dexterous manipulators, estimation theory, nonlinear filters, object tracking, pose estimation, state estimation, DARPA ARM-S system, UKF, augmented dual arm filter, coordinated bimanual manipulation, dual Barrett WAM manipulators, dual arm estimation, estimation framework, kinesthetic information, nonlinear constraints, object pose, rigid object, robot internal state, sensor-guided dual-arm manipulation, single arm filters, tactile updates, unscented Kalman filter, visual information, wheel changing task
Field
Kinesthetic learning, Computer vision, Nonlinear system, Control theory, Pose, Control engineering, Kalman filter, Video tracking, Artificial intelligence, Estimation theory, Engineering, Robot
DocType
Conference
Volume
2013
Issue
1
ISSN
1050-4729
ISBN
978-1-4673-5641-1
Citations
5
PageRank
0.49
References
0
Authors
4
Name            Order   Citations   PageRank
Paul Hebert     1       33          2.36
Nicolas Hudson  2       75          5.06
Jeremy Ma       3       181         9.93
Burdick, J.W.   4       29885       16.87