Title
Humanoids learn touch modalities identification via multi-modal robotic skin and robust tactile descriptors
Abstract
In this paper, we present a novel approach to touch modality identification via tactile sensing on a humanoid. To this end, we equipped a NAO humanoid with multi-modal artificial skin covering its entire upper body. We propose a set of biologically inspired feature descriptors that provide robust, abstract tactile information for touch classification. These features are demonstrated to be invariant to the location of contact and to the movement of the humanoid, and to handle both single- and multi-touch actions. For comparison, existing approaches were reimplemented and evaluated. The experimental results show that the humanoid can distinguish different single-touch modalities with a recognition rate of 96.79% using the proposed feature descriptors and an SVM classifier. Furthermore, it can recognize multiple touch actions with a recognition rate of 93.03%.
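The pipeline outlined in the abstract, location-invariant feature descriptors extracted from skin readings and fed to an SVM, can be illustrated with a minimal sketch. The Python/scikit-learn snippet below is a hypothetical illustration on synthetic data; the summary-statistic features, array shapes, and modality labels are assumptions for demonstration, not the paper's actual descriptors or dataset.

import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

def tactile_features(pressure_seq):
    # Hypothetical location-invariant descriptors: aggregate over all
    # taxels first, so the summary does not depend on where on the skin
    # the contact occurred, then take simple temporal statistics.
    frame_energy = pressure_seq.sum(axis=1)        # total pressure per frame
    return np.array([
        frame_energy.mean(),                       # average contact intensity
        frame_energy.std(),                        # intensity variation
        frame_energy.max(),                        # peak pressure
        np.abs(np.diff(frame_energy)).mean(),      # mean frame-to-frame change
    ])

# Synthetic stand-in data: 200 touch samples, each 50 frames x 16 taxels,
# with 4 hypothetical touch modalities. Real labeled skin recordings would
# replace these arrays; random data only demonstrates the pipeline's shape
# and yields chance-level accuracy.
rng = np.random.default_rng(0)
X_raw = rng.random((200, 50, 16))
y = rng.integers(0, 4, size=200)

X = np.array([tactile_features(sample) for sample in X_raw])
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0)

clf = SVC(kernel="rbf", C=1.0, gamma="scale")      # RBF-kernel SVM classifier
clf.fit(X_train, y_train)
print("test accuracy:", accuracy_score(y_test, clf.predict(X_test)))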
Year
2015
DOI
10.1080/01691864.2015.1095652
Venue
ADVANCED ROBOTICS
Keywords
tactile data processing, tactile feature descriptors, tactile learning, touch classification, artificial robotic skin, humanoid robots
Field
Kinesthetic learning, Modalities, Computer vision, Invariant (mathematics), Artificial intelligence, SVM classifier, Engineering, Modal, Humanoid robot
DocType
Journal
Volume
29
Issue
SP21
ISSN
0169-1864
Citations
15
PageRank
0.78
References
14
Authors
3
Name            Order  Citations  PageRank
Mohsen Kaboli   1      76         6.09
Alex Long       2      15         0.78
Gordon Cheng    3      1250       115.33