Title
Real-time human-robot interaction in complex environment using kinect v2 image recognition
Abstract
This paper presents real-time interaction between a 7-DOF KUKA robotic arm and any untrained human operator using the Kinect V2. The Kinect sensor detects the human body joints and a mono-color object to be grasped, using an HSL-XYZ algorithm. By moving a hand holding a simple object such as a ball, the operator can make the KUKA robot follow the intended trajectory and fulfill pick-and-place tasks. One hand of the operator is followed by the KUKA arm, while the pose of the other arm commands the gripper to move, grasp, release and place. Experiments showed that this form of human-robot interaction is more robust, easier and more intuitive for the human operator, places lower requirements on the sensor, and offers a novel solution for industry and everyday life. A client-server application using the UDP protocol transmits and receives real-time control and feedback data.
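The abstract mentions HSL-based detection of a mono-color object and a UDP client-server link for real-time control data, but does not give the HSL-XYZ algorithm or the message format. The sketch below is a minimal, hypothetical Python/OpenCV illustration of such a pipeline under stated assumptions: thresholding a colour range in HLS space, taking the centroid of the largest blob, and sending a Cartesian target as a single UDP datagram. The function names, colour thresholds, address/port, and the three-float message layout are all assumptions, not the authors' implementation.

```python
# Hypothetical sketch (not the authors' code): colour-based object detection
# plus a UDP sender for real-time target data. Assumes OpenCV 4.x.
import socket
import struct

import cv2
import numpy as np

UDP_ADDR = ("192.168.0.10", 30300)   # assumed address of the robot-side receiver


def detect_object_center(bgr_frame,
                         lower_hls=(40, 50, 60),
                         upper_hls=(80, 255, 255)):
    """Return the pixel centroid of the largest blob inside an HLS colour range."""
    hls = cv2.cvtColor(bgr_frame, cv2.COLOR_BGR2HLS)
    mask = cv2.inRange(hls,
                       np.array(lower_hls, dtype=np.uint8),
                       np.array(upper_hls, dtype=np.uint8))
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    largest = max(contours, key=cv2.contourArea)
    m = cv2.moments(largest)
    if m["m00"] == 0:
        return None
    return int(m["m10"] / m["m00"]), int(m["m01"] / m["m00"])


def send_target(sock, x, y, z):
    """Pack a Cartesian target as three little-endian floats in one UDP datagram."""
    sock.sendto(struct.pack("<3f", x, y, z), UDP_ADDR)


if __name__ == "__main__":
    cap = cv2.VideoCapture(0)            # stand-in for the Kinect V2 colour stream
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    ok, frame = cap.read()
    if ok:
        center = detect_object_center(frame)
        if center is not None:
            # A real system would combine (u, v) with the depth map and the
            # camera intrinsics to obtain metric XYZ before sending.
            send_target(sock, float(center[0]), float(center[1]), 0.0)
    cap.release()
```

A matching robot-side receiver would unpack the same three floats and translate them into KUKA motion commands; the actual packet contents used by the authors are not described in the abstract.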
Year: 2015
DOI: 10.1109/ICCIS.2015.7274606
Venue: 2015 IEEE 7th International Conference on Cybernetics and Intelligent Systems (CIS) and IEEE Conference on Robotics, Automation and Mechatronics (RAM)
Keywords: Human-robot interaction, Pick and place, Kinect v2, KUKA robot
Field: Computer vision, Social robot, Robot control, Robotic arm, GRASP, Artificial intelligence, SMT placement equipment, Engineering, Robot, Human–robot interaction, Trajectory
DocType: Conference
ISSN: 2326-8123
ISBN: 978-1-4673-7337-1
Citations: 0
PageRank: 0.34
References: 3
Authors: 4
Name              Order  Citations  PageRank
Yang, Y.          1      21         1.92
Haibin Yan        2      172        8.55
Masood Dehghan    3      49         7.11
Marcelo H. Ang    4      775        98.60