Title
Learning Grasp Affordance Reasoning Through Semantic Relations
Abstract
Reasoning about object affordances allows an autonomous agent to perform generalised manipulation tasks among object instances. While current approaches to grasp affordance estimation are effective, they are limited to a single hypothesis. We present an approach for detection and extraction of multiple grasp affordances on an object via visual input. We define semantics as a combination of multiple attributes, which yields benefits in terms of generalisation for grasp affordance prediction. We use Markov Logic Networks to build a knowledge base graph representation to obtain a probability distribution of grasp affordances for an object. To harvest the knowledge base, we collect and make available a novel dataset that relates different semantic attributes. We achieve reliable mappings of the predicted grasp affordances on the object by learning prototypical grasping patches from several examples. We show our method's generalisation capabilities on grasp affordance prediction for novel instances and compare with similar methods in the literature. Moreover, using a robotic platform, on simulated and real scenarios, we evaluate the success of the grasping task when conditioned on the grasp affordance prediction.
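As an illustrative aside (not the authors' implementation), the idea of scoring grasp affordances from semantic attributes via weighted logical rules, in the spirit of Markov Logic Network inference, can be sketched as below. All attribute names, rules, and weights are hypothetical placeholders.

# Toy sketch: weighted attribute-affordance rules producing a normalised
# probability distribution over grasp affordances. Purely illustrative.
import math

# Hypothetical knowledge-base rules: (required attribute, affordance, weight).
RULES = [
    ("has_handle", "handle-grasp", 2.0),
    ("is_container", "pour", 1.5),
    ("is_small", "pinch-grasp", 1.0),
    ("is_rigid", "power-grasp", 0.8),
]

def affordance_distribution(attributes):
    """Return a softmax-normalised distribution over affordances given observed attributes."""
    scores = {}
    for attr, affordance, weight in RULES:
        scores[affordance] = scores.get(affordance, 0.0) + (weight if attr in attributes else 0.0)
    z = sum(math.exp(s) for s in scores.values())
    return {a: math.exp(s) / z for a, s in scores.items()}

if __name__ == "__main__":
    # e.g. a mug-like object: has a handle, is a container, is rigid
    print(affordance_distribution({"has_handle", "is_container", "is_rigid"}))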
Year
2019
DOI
10.1109/LRA.2019.2933815
Venue
IEEE ROBOTICS AND AUTOMATION LETTERS
Keywords
Grasping, Task analysis, Knowledge based systems, Robots, Semantics, Visualization, Shape, humanoid robots, knowledge base systems
DocType
Journal
Volume
4
Issue
4
ISSN
2377-3766
Citations
0
PageRank
0.34
References
0
Authors
5
Name                      Order  Citations  PageRank
Paola Ardón               1      0          1.35
Èric Pairet               2      1          1.70
Ronald P. A. Petrick      3      309        24.24
Subramanian Ramamoorthy   4      229        45.45
Katrin S. Lohan           5      51         14.42