Title
GRIP: Generative Robust Inference and Perception for Semantic Robot Manipulation in Adversarial Environments
Abstract
Recent advancements have led to a proliferation of machine learning systems used to assist humans in a wide range of tasks. However, we are still far from accurate, reliable, and resource-efficient operation of these systems. In robot perception, convolutional neural networks (CNNs) for object detection and pose estimation have recently come into widespread use. However, neural networks are known to overfit during training and to be less robust under unforeseen conditions, which makes them especially vulnerable to adversarial scenarios. In this work, we propose Generative Robust Inference and Perception (GRIP), a two-stage object detection and pose estimation system that combines the relative strengths of discriminative CNNs and generative inference methods to achieve robust estimation. Our results show that a second stage of sample-based generative inference can recover from false object detections by CNNs and produce robust estimates under adversarial conditions. We demonstrate the robustness of GRIP through comparison with state-of-the-art learning-based pose estimators and through pick-and-place manipulation in dark and cluttered environments.
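The abstract describes a two-stage pipeline: a discriminative CNN proposes object detections, and a second generative stage samples pose hypotheses and scores how well each explains the observation, so a confident but false detection can still be rejected. The following is a minimal illustrative sketch of that idea, not the paper's implementation: the detector, the observation model, and all names (`cnn_detect`, `likelihood`, `grip_stage2`) are hypothetical stand-ins.

```python
import math
import random

def cnn_detect(image):
    # Hypothetical stand-in for the stage-1 discriminative detector:
    # returns candidate object centers with confidence scores, including
    # one false positive that the CNN nonetheless scored fairly high.
    return [{"label": "mug", "score": 0.9, "center": (0.52, 0.48)},
            {"label": "mug", "score": 0.6, "center": (0.10, 0.90)}]

def likelihood(hypothesis, observation):
    # Generative scoring: in GRIP this would compare a rendered pose
    # hypothesis against sensor data; here, a simple Gaussian-shaped
    # agreement on 2D position serves as a placeholder.
    dx = hypothesis[0] - observation[0]
    dy = hypothesis[1] - observation[1]
    return math.exp(-(dx * dx + dy * dy) / 0.01)

def grip_stage2(detections, observation, n_samples=200, seed=0):
    # Stage 2: sample pose hypotheses around every detection and keep
    # the hypothesis whose (rendered) prediction best explains the
    # observation. Samples near the false detection score poorly, so
    # the pipeline recovers from the stage-1 error.
    rng = random.Random(seed)
    best, best_w = None, -1.0
    for det in detections:
        cx, cy = det["center"]
        for _ in range(n_samples):
            h = (cx + rng.gauss(0, 0.05), cy + rng.gauss(0, 0.05))
            w = likelihood(h, observation)
            if w > best_w:
                best, best_w = h, w
    return best, best_w

# Usage: the true object sits at (0.5, 0.5); the best-weighted sample
# lands near it despite the spurious second detection.
pose, weight = grip_stage2(cnn_detect(None), (0.5, 0.5))
```

The key design point this sketch illustrates is that the generative stage does not trust detector confidence directly: hypotheses are re-weighted by how well they explain the raw observation, which is what allows recovery from false CNN detections.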
Year: 2019
DOI: 10.1109/IROS40897.2019.8967983
Venue: 2019 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS)
Field: Computer vision, Object detection, Convolutional neural network, Computer science, Inference, Pose, Robustness (computer science), Artificial intelligence, Overfitting, Artificial neural network, Discriminative model, Machine learning
DocType: Conference
Volume: abs/1903.08352
ISSN: 2153-0858
Citations: 0
PageRank: 0.34
References: 0
Authors: 7
Name            Order  Citations  PageRank
Xiaotong Chen   1      0          3.72
Rui Chen        2      4          5.23
Zhiqiang Sui    3      14         4.66
Zhefan Ye       4      0          0.68
Yanqi Liu       5      0          2.37
Iris Bahar      6      22         3.30
R. Iris Bahar   7      0          2.03