Title
Towards a Programming-Free Robotic System for Assembly Tasks Using Intuitive Interactions
Abstract
Although industrial robots are successfully deployed in many assembly processes, high-mix, low-volume applications are still difficult to automate, as they involve small batches of frequently changing parts. Setting up a robotic system for these tasks requires repeated reprogramming by expert users, incurring extra time and costs. In this paper, we present a solution that enables a robot to learn new objects and new tasks from non-expert users without the need for programming. The use case presented here is the assembly of a gearbox mechanism. In the proposed solution, the robot first registers new objects autonomously using a visual exploration routine and trains a deep learning model for object detection accordingly. Second, the user can teach new tasks to the system via visual demonstration in a natural manner. Finally, using multimodal perception from RGB-D (color and depth) cameras and a tactile sensor, the robot executes the taught tasks while adapting to changing configurations. Depending on the task requirements, it can also activate human-robot collaboration capabilities. In summary, these three main modules enable any non-expert user to configure a robot for new applications in a fast and intuitive way.
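As a rough illustration of how the three modules summarized in the abstract (object registration, task teaching, and task execution) might be orchestrated, the following Python sketch outlines one possible pipeline. All class and method names (ObjectRegistration, TaskTeaching, TaskExecutor, and their methods) are hypothetical placeholders introduced here for illustration; they are not the interfaces or implementation used by the authors.

```python
# Hypothetical sketch of the three-module pipeline described in the abstract.
# Every class and method name below is an illustrative placeholder, not the authors' API.

class ObjectRegistration:
    """Module 1: autonomously register a new object via visual exploration."""

    def explore_and_capture(self, object_id):
        # Move the camera around the new part and collect labelled RGB-D views.
        return [f"{object_id}_view_{i}.png" for i in range(12)]

    def train_detector(self, images):
        # Fine-tune an object-detection model on the captured views.
        print(f"Training detector on {len(images)} images...")


class TaskTeaching:
    """Module 2: learn a task from a visual demonstration by the user."""

    def record_demonstration(self):
        # Observe the user's demonstration and extract an ordered list of steps.
        return [("pick", "gear_small"), ("insert", "shaft_A")]


class TaskExecutor:
    """Module 3: execute the taught task using RGB-D and tactile feedback."""

    def run(self, steps):
        for action, target in steps:
            # Detect the target, perform the action with tactile feedback,
            # and request human collaboration if the step requires it.
            print(f"Executing '{action}' on '{target}'")


if __name__ == "__main__":
    registration = ObjectRegistration()
    views = registration.explore_and_capture("gear_small")
    registration.train_detector(views)

    steps = TaskTeaching().record_demonstration()
    TaskExecutor().run(steps)
```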
Year
2021
DOI
10.1007/978-3-030-90525-5_18
Venue
SOCIAL ROBOTICS, ICSR 2021
Keywords
Robotic manipulation, Multimodal perception, Object and task teaching, Grasping and insertion, Human-robot collaboration
DocType
Conference
Volume
13086
ISSN
0302-9743
Citations
0
PageRank
0.34
References
0
Authors
8
Name              Order  Citations  PageRank
Nicolas Gauthier  1      0          2.03
Wenyu Liang       2      0          1.01
Qianli Xu         3      90         15.17
Fen Fang          4      0          2.03
Liyuan Li         5      48         13.24
Ruihan Gao        6      0          0.34
Wu Yan            7      38         8.09
Joo-Hwee Lim      8      783        82.45