Cortically Inspired Sensor Fusion Network
for Mobile Robot Heading Estimation
Cristian Axenie and Jörg Conradt
Fachgebiet Neurowissenschaftliche Systemtheorie, Fakultät für Elektro- und
Informationstechnik, Technische Universität München, 80290 München, Germany
{cristian.axenie,conradt}@tum.de
Abstract. All physical systems must reliably extract information from their noisy and only partially observable environment, such as distances to objects.
Biology has developed reliable mechanisms to combine multi-modal sensory
information into a coherent belief about the underlying environment that caused
the percept, a process called sensor fusion. Autonomous technical systems
(such as mobile robots) employ compute-intensive algorithms for sensor fusion, which rarely run in real time; yet their results in complex, unprepared environments are typically inferior to human performance. Although little is known about cortical computing principles for sensor fusion, an obvious difference between biological and technical information processing lies in the way information flows: computer algorithms are typically designed as feed-forward filter banks, whereas in cortex we find vast, recurrently connected networks with intertwined information processing, storage, and exchange. In
this paper we model such information processing as a distributed graphical
network, in which independent neural computing nodes obtain and represent
sensory information, while processing and exchanging exclusively local data.
Given various external sensory stimuli, the network relaxes into the best
possible explanation of the underlying cause, subject to the inferred reliability
of the sensor signals. We implement a simple test-case scenario, a four-dimensional sensor fusion task on an autonomous mobile robot, and demonstrate
its performance. We expect to be able to expand this sensor fusion principle to
vastly more complex tasks.
Keywords: Cortically inspired sensor fusion, graphical network, local processing,
mobile robotics.
1 Introduction
Environmental perception enables a physical system to acquire and maintain an internal representation of significant information within its environment. As an example of such an internal state, accurate self-motion perception is an essential component of spatial orientation, navigation, and motor planning for both biological and artificial systems.
A system can build its spatial knowledge by combining multiple sources of information, conveyed not only by self-motion-related signals (e.g. odometry or vestibular signals) but also by static external environmental cues (e.g. visual or auditory) [1].
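As a minimal sketch of this principle (assuming, purely for illustration, three heading cues from odometry, a gyroscope, and a compass, with hand-picked reliability weights), the following Python fragment shows how network nodes that exchange only local data can relax toward a reliability-weighted consensus heading. It illustrates the flavor of computation, not the implementation evaluated in this paper; all node names, weights, readings, and the topology below are assumptions.

import math

# Hypothetical sensor readings: local heading estimates (radians) and
# inferred reliability weights (e.g. inverse sensor variance).
nodes = {
    "odometry":  {"theta": math.radians(42.0), "w": 1.0},
    "gyroscope": {"theta": math.radians(45.0), "w": 4.0},
    "compass":   {"theta": math.radians(48.0), "w": 2.0},
}

# Graph topology: each node exchanges values with its neighbours only.
neighbours = {
    "odometry":  ["gyroscope", "compass"],
    "gyroscope": ["odometry", "compass"],
    "compass":   ["odometry", "gyroscope"],
}

def weighted_circular_mean(pairs):
    # Reliability-weighted mean of angles via unit-vector averaging,
    # which avoids wrap-around artefacts near +/- 180 degrees.
    s = sum(w * math.sin(t) for t, w in pairs)
    c = sum(w * math.cos(t) for t, w in pairs)
    return math.atan2(s, c)

# Relaxation: every node moves its estimate toward the weighted mean of
# its own value and its neighbours' current values; only local data flows.
for step in range(50):
    new_theta = {}
    for name, node in nodes.items():
        pairs = [(node["theta"], node["w"])]
        pairs += [(nodes[n]["theta"], nodes[n]["w"]) for n in neighbours[name]]
        new_theta[name] = weighted_circular_mean(pairs)
    for name in nodes:
        nodes[name]["theta"] = new_theta[name]

for name, node in nodes.items():
    print(f"{name:9s} -> {math.degrees(node['theta']):.2f} deg")

In this toy example the estimates converge to a consensus near 45.4 degrees, dominated by the most reliable cue (the gyroscope); because each update draws only on a node's neighbours, the computation remains strictly local.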