Title
A hybrid architecture for the sensorimotor exploration of spatial scenes
Abstract
Humans are highly efficient at analysing, exploring and representing their environment. Based on the neurobiological and cognitive principles of human information processing, we develop a system for the automatic identification and exploration of spatial configurations. The system sequentially selects "informative" regions (regions of interest), identifies the local structure, and uses this information to draw efficient conclusions about the current scene. The selection process involves low-level, bottom-up processes for sensory feature extraction, and cognitive top-down processes for the generation of active motor commands that control the positioning of the sensors towards the most informative regions. Both processing levels must deal with uncertain data and must take into account prior knowledge derived from statistical properties and learning. We suggest that this can be achieved in a hybrid architecture that integrates a nonlinear filtering stage modelled after the neural computations performed in the early stages of the visual system, and a cognitive reasoning strategy that operates in an adaptive fashion on a belief distribution.
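The abstract outlines a sense-and-reason loop: bottom-up saliency proposes candidate regions of interest, a fixation is made, the local structure at that region is identified, and a belief distribution over scene hypotheses is updated to guide the next fixation. The paper itself gives no implementation; the following Python sketch is only a minimal illustration of such a loop under simplifying assumptions (a discrete feature alphabet, a known likelihood model, and region selection by expected information gain), and all function and variable names are invented here.

```python
import numpy as np

# Hypothetical sketch, not code from the paper: a minimal sensorimotor exploration
# loop combining a bottom-up saliency ranking with a top-down, belief-driven
# choice of the next region of interest.

rng = np.random.default_rng(0)

def entropy(p):
    """Shannon entropy of a discrete distribution (zero entries ignored)."""
    p = p[p > 0]
    return -np.sum(p * np.log(p))

def posterior(belief, lik_f):
    """Bayesian update of the scene belief; lik_f[s] = P(observed feature | scene s)."""
    post = belief * lik_f
    return post / post.sum()

def expected_information_gain(belief, lik_r):
    """Expected entropy reduction from fixating one region.
    lik_r[s, f] = P(local feature class f at this region | scene s)."""
    predictive = belief @ lik_r                      # P(f) under the current belief
    gain = entropy(belief)
    for f, pf in enumerate(predictive):
        if pf > 0:
            gain -= pf * entropy(posterior(belief, lik_r[:, f]))
    return gain

def explore(likelihood, saliency, true_scene, n_fixations=5):
    """Sequentially fixate regions, identify local structure, update the belief.
    likelihood[s, r, f] = P(feature f at region r | scene s)."""
    n_scenes, n_regions, n_features = likelihood.shape
    belief = np.full(n_scenes, 1.0 / n_scenes)       # uniform prior over scenes
    visited = set()
    for _ in range(n_fixations):
        # Bottom-up stage: rank regions by saliency; top-down stage: among the
        # unvisited candidates, pick the one with the highest expected gain.
        candidates = [r for r in np.argsort(-saliency) if r not in visited]
        nxt = max(candidates,
                  key=lambda c: expected_information_gain(belief, likelihood[:, c, :]))
        visited.add(nxt)
        # Simulated sensing: sample the local feature class observed at the region.
        f = rng.choice(n_features, p=likelihood[true_scene, nxt, :])
        belief = posterior(belief, likelihood[:, nxt, f])
    return belief
```

Given a likelihood array of shape (scenes, regions, features) and a saliency vector over regions, `explore` returns the belief over scene hypotheses after a small number of fixations; this is meant only to make the described selection-and-update cycle concrete, not to reproduce the authors' system.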
Year
2005
DOI
10.1007/11676935_40
Venue
WILF
Keywords
cognitive reasoning strategy, spatial scene, sensorimotor exploration, cognitive principle, hybrid architecture, human information processing, cognitive top-down process, informative region, visual system, system sequentially, processing level, efficient conclusion, account previous knowledge, feature extraction, nonlinear filter, region of interest, top down processing, bottom up
Field
Architecture, Information processing, Computer science, Local structure, Uncertain data, Feature extraction, Artificial intelligence, Cognition, Sensory system, Machine learning, Computation
DocType
Conference
Volume
3849
ISSN
0302-9743
ISBN
3-540-32529-8
Citations
1
PageRank
0.38
References
8
Authors
3
Name                 Order  Citations  PageRank
Kerstin Schill       1      183        25.15
C. Zetzsche          2      181        26.79
Thusitha Parakrama   3      1          0.38