Title
Multi-Sensor Fusion Based Robot Self-Activity Recognition
Abstract
Robots play increasingly important roles in our daily life. To better complete assigned tasks, robots need the ability to recognize their self-activities in real time. To perceive the environment, robots are usually equipped with rich sensors, which can also be used to recognize their self-activities. However, since the intrinsics of sensors such as accelerometers, servomotors, and gyroscopes may differ significantly, an individual sensor usually exhibits weak performance in perceiving the environment. Multi-sensor fusion therefore becomes a promising technique for achieving better performance. In this paper, addressing the issue of robot self-activity recognition, we propose a framework to fuse information from multiple sensory streams. Our framework adopts a Recurrent Neural Network (RNN) with Long Short-Term Memory (LSTM) units to model the temporal information conveyed in multiple sensory streams. In the architecture, a hierarchical structure learns sensor-specific features, and a shared layer fuses the features extracted from the multiple sensory streams. We collect a dataset on the PKU-HR6.0 robot to evaluate the proposed framework. The experimental results demonstrate the effectiveness of the proposed framework.
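The architecture the abstract describes (sensor-specific LSTM encoders feeding a shared fusion layer) can be sketched as follows. This is a minimal illustration with plain NumPy, not the authors' implementation: the sensor dimensions, hidden size, number of activity classes, and all function names are assumptions chosen for the example, and the parameters are random rather than trained.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def init_lstm(input_dim, hidden_dim):
    # Random (untrained) parameters for one LSTM cell; the four gates
    # (input, forget, output, candidate) are stacked along the first axis.
    return {
        "W": rng.normal(0, 0.1, (4 * hidden_dim, input_dim)),
        "U": rng.normal(0, 0.1, (4 * hidden_dim, hidden_dim)),
        "b": np.zeros(4 * hidden_dim),
    }

def lstm_encode(seq, params, hidden_dim):
    # Run an LSTM over a (T, input_dim) sequence and return the final
    # hidden state as the sensor-specific feature vector.
    h = np.zeros(hidden_dim)
    c = np.zeros(hidden_dim)
    for x in seq:
        z = params["W"] @ x + params["U"] @ h + params["b"]
        i, f, o = (sigmoid(z[k * hidden_dim:(k + 1) * hidden_dim]) for k in range(3))
        g = np.tanh(z[3 * hidden_dim:])
        c = f * c + i * g
        h = o * np.tanh(c)
    return h

def fuse_and_classify(streams, encoders, hidden_dim, W_shared, W_out):
    # Hierarchy: one LSTM encoder per sensory stream, then a shared layer
    # fuses the concatenated features before the activity classifier.
    feats = [lstm_encode(s, p, hidden_dim) for s, p in zip(streams, encoders)]
    shared = np.tanh(W_shared @ np.concatenate(feats))  # fused representation
    logits = W_out @ shared
    e = np.exp(logits - logits.max())
    return e / e.sum()  # softmax over activity classes

# Hypothetical setup: accelerometer (3-D), gyroscope (3-D), servomotor
# readings (8-D); T = 20 time steps, hidden size 16, 5 activity classes.
dims, T, H, K = [3, 3, 8], 20, 16, 5
encoders = [init_lstm(d, H) for d in dims]
W_shared = rng.normal(0, 0.1, (H, len(dims) * H))
W_out = rng.normal(0, 0.1, (K, H))
streams = [rng.normal(size=(T, d)) for d in dims]
probs = fuse_and_classify(streams, encoders, H, W_shared, W_out)
```

With trained weights, `probs` would give a distribution over the robot's self-activity classes; here it only demonstrates the data flow of the hierarchical fusion.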
Year
2018
DOI
10.1109/HUMANOIDS.2018.8624918
Venue
2018 IEEE-RAS 18TH INTERNATIONAL CONFERENCE ON HUMANOID ROBOTS (HUMANOIDS)
Field
Computer vision, Activity recognition, Accelerometer, Simulation, Computer science, Recurrent neural network, Sensor fusion, Artificial intelligence, Robot, Fuse (electrical), Intrinsics, Servomotor
DocType
Conference
ISSN
2164-0572
Citations
0
PageRank
0.34
References
0
Authors
4

Name            Order  Citations  PageRank
Dingsheng Luo   1      46         11.61
Yang Ma         2      0          0.34
Xiangqi Zhang   3      0          0.34
Xihong Wu       4      279        53.02