Abstract
Rachael Stentiford
Insect visual navigation in natural scenes: Maintaining head direction estimates with a spiking neural network model of the central complex and active behavioural strategies
Maintaining a stable estimate of heading direction is an essential component of many behaviours across species. Despite having small brains (~10⁶ neurons) and low-resolution eyes (~10³ pixels), insects learn to navigate in highly dynamic environments extremely rapidly and robustly. Heading direction is tracked by ‘compass neurons’ in the central complex, which show strongly direction-tuned activity. This estimate of heading direction can be maintained with angular self-motion information (e.g. from efference copy and/or optic flow), but it is subject to accumulating error that makes the estimate increasingly inaccurate. To mitigate this error, heading direction is additionally stabilised by external visual features. Specifically, the insect learns the relationships between visual information arriving from visual ring neurons and the compass neurons representing its current heading, such that when a visual scene is revisited, the appropriate heading is recalled despite noisy self-motion cues.
Using a spiking neural network model of the insect central complex, we first show that simple visual filtering by ring neurons with Drosophila-inspired receptive fields is sufficient to form a mapping between heading and complex natural scenes. This mapping can maintain the heading estimate in the absence of self-motion information. However, over extended periods, visual features will change and the ring-neuron-to-compass-neuron mappings will become inaccurate. We then explore how active behavioural strategies that resample headings can support the maintenance of accurate heading estimates.
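The core mechanism described above (a heading estimate propagated by noisy self-motion integration and anchored by learned associations between visual input and heading) can be illustrated with a toy sketch. The Python below is not the spiking central-complex model itself: the discretised heading bins, random sparse scene features, Hebbian-style learning rule and snap-to-recall correction are all illustrative assumptions, chosen only to show how view-to-heading associations counter path-integration drift.

```python
import numpy as np

rng = np.random.default_rng(0)

N_HEADINGS = 16      # discretised heading bins (stand-in for compass neurons)
N_RING = 32          # visual feature channels (stand-in for ring neurons)
DT = 0.1             # simulation time step (s)

# Hypothetical "scene": each heading evokes a sparse visual feature vector.
scene = (rng.random((N_HEADINGS, N_RING)) < 0.2).astype(float)

# Learned association between visual features and headings (starts blank).
weights = np.zeros((N_RING, N_HEADINGS))

def integrate(est, ang_vel, noise_sd=0.05):
    """Path integration: update heading estimate from noisy angular velocity."""
    return (est + (ang_vel + rng.normal(0, noise_sd)) * DT) % (2 * np.pi)

def heading_bin(theta):
    """Map a heading in radians onto one of the discrete heading bins."""
    return int(theta / (2 * np.pi) * N_HEADINGS) % N_HEADINGS

def learn(visual, est):
    """Hebbian-style association: link the current view to the current estimate."""
    weights[:, heading_bin(est)] += visual

def recall(visual):
    """Recall the heading most strongly associated with the current view."""
    votes = visual @ weights
    if votes.max() == 0:
        return None                      # unfamiliar scene: no correction
    return (votes.argmax() + 0.5) * 2 * np.pi / N_HEADINGS

# Lap 1: rotate on the spot, learning view-to-heading associations as we go.
true_heading, est = 0.0, 0.0
for _ in range(int(2 * np.pi / (0.5 * DT))):
    true_heading = (true_heading + 0.5 * DT) % (2 * np.pi)
    est = integrate(est, 0.5)
    learn(scene[heading_bin(true_heading)], est)

# Lap 2: path integration alone drifts; visual recall pulls the estimate back.
for _ in range(int(2 * np.pi / (0.5 * DT))):
    true_heading = (true_heading + 0.5 * DT) % (2 * np.pi)
    est = integrate(est, 0.5)
    recalled = recall(scene[heading_bin(true_heading)])
    if recalled is not None:
        est = recalled                   # snap estimate to the recalled heading

print(f"true={np.degrees(true_heading):.1f} deg, estimate={np.degrees(est):.1f} deg")
```

In this sketch the visual correction simply overwrites the drifting estimate with the recalled heading; in the nervous system the correction is instead thought to act through learned synaptic weights that bias compass-neuron activity, but the qualitative effect (the error stops accumulating once familiar views are revisited) is the same.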
Short Bio
Rachael Stentiford is a Research Fellow at the University of Sussex, where she works on spiking neural network models of the insect head direction system. She completed her PhD in Integrative Neuroscience at the University of Bristol, before a postdoctoral position at the Bristol Robotics Laboratory working on mammalian models of navigation. With a background in both experimental and computational neuroscience methods, she is interested in interdisciplinary approaches to exploring visual navigation across species.