Title |
---|
Object recognition for robotics from tactile time series data utilising different neural network architectures |
Abstract |
---|
Robots need to exploit high-quality information about grasped objects to interact with the physical environment; haptic data can therefore supplement the visual modality. This paper investigates the use of Convolutional Neural Network (CNN) and Long Short-Term Memory (LSTM) neural network architectures for object classification on spatio-temporal tactile grasping data. Furthermore, we compare these methods using data from two different fingertip sensors (namely the BioTac SP and the WTS-FT) in the same physical setup, allowing a realistic comparison across methods and sensors on the same tactile object classification dataset. Additionally, we propose a way to create more training examples from the recorded data. The results show that the proposed method improves the maximum accuracy from 82.4% (BioTac SP fingertips) and 90.7% (WTS-FT fingertips) with complete time-series data to about 94% for both sensor types. |
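The abstract's idea of creating more training examples from recorded grasps can be illustrated with a common time-series augmentation: slicing each recording into overlapping sub-sequences. This is a minimal NumPy sketch under assumed shapes (`T` time steps by `F` taxel channels); the function name, window length, and stride are illustrative, not the paper's actual parameters.

```python
import numpy as np

def sliding_windows(series: np.ndarray, window: int, stride: int) -> np.ndarray:
    """Split one (T, F) tactile recording into overlapping (window, F) sub-sequences.

    Each sub-sequence inherits the label of the full recording, multiplying
    the number of training examples per grasp.
    """
    starts = range(0, series.shape[0] - window + 1, stride)
    return np.stack([series[s:s + window] for s in starts])

# Hypothetical recording: 100 time steps, 24 taxel channels.
recording = np.random.rand(100, 24)
augmented = sliding_windows(recording, window=40, stride=10)
print(augmented.shape)  # (7, 40, 24): seven labelled examples from one grasp
```

Shorter windows also let a classifier be evaluated on partial grasps, which relates to the abstract's contrast between complete time-series data and the improved results.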
Year | DOI | Venue |
---|---|---|
2021 | 10.1109/IJCNN52387.2021.9533388 | 2021 International Joint Conference on Neural Networks (IJCNN) |
Keywords | DocType | ISSN |
---|---|---|
3D-CNN, CNN, LSTM, tactile sensing, object classification, BioTac, WTS-FT | Conference | 2161-4393 |
Citations | PageRank | References |
---|---|---|
0 | 0.34 | 0 |
Authors |
---|
4 |
Name | Order | Citations | PageRank |
---|---|---|---|
Wolfgang Böttcher | 1 | 0 | 0.34 |
Pedro Machado | 2 | 11 | 4.61 |
Nikesh Lama | 3 | 0 | 0.34 |
T. Martin McGinnity | 4 | 518 | 66.30 |