Title
Self-Supervised Learning of Visual Servoing for Low-Rigidity Robots Considering Temporal Body Changes
Abstract
In this study, we investigate object grasping by visual servoing in a low-rigidity robot. Compared with a rigid robot, it is difficult for a low-rigidity robot to handle its own body as intended, and calibration between vision and body takes time. In addition, the robot must constantly adapt to changes in its body, such as shifts in camera position and changes in its joints due to aging. Therefore, we develop a method for a low-rigidity robot to autonomously learn visual servoing of its body. We also develop a mechanism that adaptively changes the visual servoing according to temporal body changes. We apply our method to a low-rigidity 6-axis arm, MyCobot, and confirm its effectiveness through object grasping experiments based on visual servoing.
Year
2022
DOI
10.1109/LRA.2022.3186074
Venue
IEEE ROBOTICS AND AUTOMATION LETTERS
Keywords
Learning from experience, learning from demonstration, visual servoing
DocType
Journal
Volume
7
Issue
3
ISSN
2377-3766
Citations
0
PageRank
0.34
References
0
Authors
4
Name                Order  Citations  PageRank
Kento Kawaharazuka  1      0          0.68
Naoaki Kanazawa     2      0          0.34
Kei Okada           3      5341       18.08
Masayuki Inaba      4      21864      10.27