Julian Göltz

From biology to silicon substrates: neural computation with physics

Abstract
Whether biological or artificial, the degree of intelligence we attribute to a system ultimately depends on its ability to perform complex computations efficiently. Designing systems capable of this requires overcoming a set of interrelated, interdisciplinary challenges. Setting out from descriptions of the biological mechanisms thought to underlie cortical computation, we aim to derive models capable of efficient coding, computation and learning. Fascinatingly, we find that some physical limitations thought to hinder computation can be leveraged to overcome others. In this spirit, I will present how some of our neuro-inspired models implement gradient-free learning, utilise noise in a functional manner, and learn with precise spike timings. For networks using this latter coding, I examine their information propagation, showing its similarity to established neuroscientific concepts, and describe implementations on neuromorphic hardware, including trained, functional transmission delays.