Dendritic normalisation improves learning in sparsely connected artificial neural networks

Inspired by the physiology of neuronal systems in the brain, artificial neural networks have become an invaluable tool for machine learning applications. However, their biological realism and theoretical tractability are limited, resulting in poorly understood parameters. We have recently shown that biological neuronal firing rates in response to distributed inputs are largely independent of size, meaning that neurons are typically responsive to the proportion, not the absolute number, of their inputs that are active. Here we introduce such a normalisation, in which the strength of a neuron’s afferents is divided by their number, to various sparsely connected artificial networks. Learning performance is dramatically increased, providing an improvement over other widely used normalisations in sparse networks. The resulting machine learning tools are universally applicable and biologically inspired, rendering them better understood and more stable in our tests.
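The normalisation described in the abstract can be sketched in a few lines: each neuron's summed input is divided by the number of afferents it actually receives, so its activity reflects the proportion of active inputs rather than their absolute count. The following is a minimal illustrative sketch, not the authors' implementation; all names, layer sizes, and the connection probability are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sparsely connected layer: sizes and connection
# probability are illustrative assumptions, not from the paper.
n_in, n_out, p_connect = 100, 10, 0.1

mask = rng.random((n_in, n_out)) < p_connect      # sparse connectivity
weights = rng.normal(size=(n_in, n_out)) * mask   # weights exist only on connections

def forward_normalised(x, weights, mask):
    """Pre-activation with dendritic normalisation: each neuron's net
    input is divided by its number of afferents (incoming connections)."""
    afferents = mask.sum(axis=0)                  # afferent count per output neuron
    afferents = np.maximum(afferents, 1)          # guard against isolated neurons
    return (x @ weights) / afferents

x = rng.random(n_in)
y = forward_normalised(x, weights, mask)
print(y.shape)  # (10,)
```

Dividing by the afferent count keeps a neuron's expected drive independent of how many connections it happens to have, which is the size-independence property the abstract attributes to biological neurons.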

Authors: Alexander D. Bird, Hermann Cuntz
Parent Title (English): bioRxiv
Document Type: Preprint
Date of Publication (online): 2020/05/28
Date of first Publication: 2020/05/28
Publishing Institution: Universitätsbibliothek Johann Christian Senckenberg
Release Date: 2023/03/25
Page Number: 11
Institutes: Wissenschaftliche Zentren und koordinierte Programme / Frankfurt Institute for Advanced Studies (FIAS)
Dewey Decimal Classification: 0 Computer science, information & general works / 00 Computer science, knowledge & systems / 004 Data processing; computer science
Licence: Creative Commons - CC BY-NC-ND - Attribution - NonCommercial - NoDerivatives 4.0 International