Title
Robust top-down and bottom-up visual saliency for mobile robots using bio-inspired design principles
Abstract
Modern camera systems in robotics tend to produce overwhelming amounts of visual information due to their high resolutions and frame rates. This raises a fundamental question: how should robots focus attention on a region of the visual scene, and how should they process information in the periphery? The issue is particularly acute for mobile robots, where low-power embedded computing boards offer far fewer computational resources than workstations. In this paper, we look to the biological design of the primate brain for inspiration on how to solve this problem. We develop a novel computational fusion of bottom-up and top-down visual saliency information. The bottom-up saliency is produced using standard colour, intensity, and motion image processing methods. The top-down saliency is produced using a deep convolutional neural network for object detection and recognition, operating on foveated images for computational efficiency. Regions of attention are obtained using a computational model of the basal ganglia, a brain structure thought to be involved in optimal decision making. The model of the basal ganglia is based on the multi-hypothesis sequential probability ratio test (MSPRT). The visual saliency scheme is evaluated on an omnidirectional video feed, highlighting its proximity to human behaviour.
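The abstract names the MSPRT as the decision mechanism behind the basal ganglia model. As a rough illustration only (this is not the authors' implementation; the function name, evidence format, and threshold are assumptions), a minimal MSPRT sketch accumulates per-hypothesis log-likelihood evidence over time and decides as soon as the softmax posterior for one hypothesis crosses a confidence threshold:

```python
import numpy as np

def msprt(evidence_stream, threshold=0.99):
    """Multi-hypothesis sequential probability ratio test (illustrative sketch).

    evidence_stream: iterable of per-hypothesis log-likelihood increments
    (one array per time step, one entry per competing hypothesis/channel).
    Returns (index of winning hypothesis, number of steps to decision).
    """
    y = None  # accumulated log-evidence per hypothesis
    p = None
    t = 0
    for t, increments in enumerate(evidence_stream, start=1):
        increments = np.asarray(increments, dtype=float)
        y = increments if y is None else y + increments
        # Softmax of accumulated evidence gives the posterior over hypotheses
        p = np.exp(y - y.max())
        p /= p.sum()
        if p.max() >= threshold:  # stop as soon as one hypothesis is confident enough
            return int(p.argmax()), t
    if p is None:
        raise ValueError("evidence_stream was empty")
    return int(p.argmax()), t  # fall back to the best hypothesis so far
```

For example, with three channels and a constant unit of evidence arriving for channel 2 at every step, `msprt([[0.0, 0.0, 1.0]] * 10, threshold=0.9)` selects channel 2 after three steps, since the softmax posterior first exceeds 0.9 at t = 3.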
Year: 2021
DOI: 10.1109/IROS51168.2021.9636800
Venue: 2021 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS)
DocType: Conference
ISSN: 2153-0858
Citations: 0
PageRank: 0.34
References: 0
Authors: 4
Name | Order | Citations | PageRank
Uziel Jaramillo-Avila | 1 | 0 | 0.34
Jonathan M. Aitken | 2 | 26 | 6.92
Kevin N. Gurney | 3 | 4455 | 3.49
Sean R. Anderson | 4 | 0 | 0.68