Title
Pick-Up Motion Based on Vision and Tactile Information in Hand/Arm Robot
Abstract
Multi-fingered robot hands have received much attention in various fields. We have developed a multi-fingered robot hand equipped with multi-axis force/torque sensors. For stable transportation, the robot hand must pick up an object without dropping it and place it without damaging it. This paper deals with a pick-up motion based on vision and tactile information using the developed robot hand. The robot hand first learns a posture for picking up an object from tactile values and a visual image in advance, then determines the number of fingers to use in the pick-up motion from the visual image. The effectiveness of the proposed grasp selection is verified through experiments with the universal robot hand.
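The abstract describes selecting the number of fingers for the pick-up motion from the visual image and confirming the grasp with tactile (force/torque) values. The following is a minimal, illustrative Python sketch of that idea only; the thresholds, finger counts, and function names are hypothetical assumptions and are not taken from the paper.

```python
# Illustrative sketch (not the authors' implementation): pick a number of
# fingers from a vision-based size estimate, then use fingertip force
# readings as a simple stability check before lifting. All thresholds and
# the 2/3/4-finger mapping are assumptions for illustration.

from dataclasses import dataclass


@dataclass
class VisionEstimate:
    width_mm: float   # object width estimated from the camera image (assumed)
    height_mm: float  # object height estimated from the camera image (assumed)


def select_finger_count(est: VisionEstimate,
                        small_limit_mm: float = 30.0,
                        large_limit_mm: float = 80.0) -> int:
    """Map an estimated object width to a finger count:
    small -> 2-finger pinch, medium -> 3-finger tripod, large -> 4 fingers."""
    if est.width_mm <= small_limit_mm:
        return 2
    if est.width_mm <= large_limit_mm:
        return 3
    return 4


def stable_grasp(fingertip_forces_n: list[float],
                 min_force_n: float = 0.5) -> bool:
    """Tactile check: every engaged fingertip should report at least a
    minimum normal force before the lift is attempted."""
    return all(f >= min_force_n for f in fingertip_forces_n)


if __name__ == "__main__":
    est = VisionEstimate(width_mm=45.0, height_mm=60.0)
    n = select_finger_count(est)
    forces = [0.8, 0.6, 0.7]  # simulated fingertip normal forces [N]
    print(f"use {n} fingers, stable grasp: {stable_grasp(forces[:n])}")
```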
Year
2016
DOI
10.1109/CMCSN.2016.36
Venue
2016 Third International Conference on Computing Measurement Control and Sensor Network (CMCSN)
Keywords
Robot Hand, Teleoperation, Visual Feedback, Tactile Feedback
Field
Social robot, Computer vision, Robot control, Bang-bang robot, Computer science, Robot kinematics, Robot end effector, Artificial intelligence, Mobile robot navigation, Cartesian coordinate robot, Mobile robot
DocType
Conference
ISBN
978-1-5090-1094-3
Citations
0
PageRank
0.34
References
3
Authors
4
Name | Order | Citations | PageRank
Futoshi Kobayashi | 1 | 36 | 14.80
Shou Minoura | 2 | 0 | 0.34
Hiroyuki Nakamoto | 3 | 13 | 8.01
Fumio Kojima | 4 | 77 | 18.33