Title
Graph Neural Networks for Interpretable Tactile Sensing
Abstract
Fine-grained tactile perception of objects is important for robots exploring unstructured environments. Recent years have seen the success of Convolutional Neural Network (CNN)-based methods for tactile perception using high-resolution optical tactile sensors. However, CNN-based approaches may be inefficient for processing tactile image data and offer limited interpretability. To this end, we propose a Graph Neural Network (GNN)-based approach for tactile recognition using a soft biomimetic optical tactile sensor. The captured tactile images are transformed into graphs, and a GNN is used to analyse the implicit tactile information within them. The experimental results indicate that the proposed GNN-based method achieves a maximum tactile recognition accuracy of 99.53%. In addition, Gradient-weighted Class Activation Mapping (Grad-CAM) and Unsigned Grad-CAM (UGrad-CAM) are used to provide visual explanations of the models. Compared to traditional CNNs, we demonstrate that the features generated by the GNN-based model are more intuitive and interpretable.
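The image-to-graph conversion mentioned in the abstract can be illustrated with a minimal sketch. The paper's actual pipeline is not detailed in this record, so the thresholded node selection, 4-neighbour connectivity, and mean-aggregation message passing below are all assumptions chosen for illustration only:

```python
# Illustrative sketch (NOT the paper's pipeline): turn a tactile image
# into a graph, then run one GCN-style message-passing step.

def image_to_graph(img, threshold=0.5):
    """Nodes are pixels whose intensity exceeds `threshold` (assumed
    criterion); edges connect 4-adjacent nodes."""
    nodes = {}  # (row, col) -> node index
    for r, row in enumerate(img):
        for c, v in enumerate(row):
            if v > threshold:
                nodes[(r, c)] = len(nodes)
    edges = []
    for (r, c), i in nodes.items():
        for dr, dc in ((0, 1), (1, 0)):  # right and down neighbours
            j = nodes.get((r + dr, c + dc))
            if j is not None:
                edges.append((i, j))
    feats = [[img[r][c]] for (r, c) in nodes]  # intensity as node feature
    return nodes, edges, feats

def gcn_step(n_nodes, edges, feats):
    """One mean-aggregation message-passing step (self-loops included)."""
    neigh = {i: [i] for i in range(n_nodes)}
    for i, j in edges:
        neigh[i].append(j)
        neigh[j].append(i)
    return [
        [sum(feats[j][0] for j in neigh[i]) / len(neigh[i])]
        for i in range(n_nodes)
    ]
```

For example, a 2x3 image with three pixels above threshold yields a three-node graph with two edges, and one propagation step smooths each node's feature toward its neighbourhood mean. A real implementation would add learnable weights and stack several such layers.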
Year
2022
DOI
10.1109/ICAC55051.2022.9911130
Venue
2022 27th International Conference on Automation and Computing (ICAC)
Keywords
Tactile Sensor, Object Recognition, Graph Convolutional Network, Explainability
DocType
Conference
ISBN
978-1-6654-9808-1
Citations
0
PageRank
0.34
References
6
Authors
7
Name | Order | Citations | PageRank
Wen Fan | 1 | 0 | 0.34
Hongbo Bo | 2 | 0 | 0.34
Yijiong Lin | 3 | 0 | 0.34
Yifan Xing | 4 | 0 | 0.34
Weiru Liu | 5 | 1597 | 112.05
Nathan Lepora | 6 | 0 | 0.34
Dandan Zhang | 7 | 0 | 0.34