TY - JOUR
A1 - Echeveste, Rodrigo
A1 - Eckmann, Samuel
A1 - Gros, Claudius
T1 - The Fisher information as a neural guiding principle for independent component analysis
T2 - Entropy
N2 - The Fisher information constitutes a natural measure for the sensitivity of a probability distribution with respect to a set of parameters. An implementation of the stationarity principle for synaptic learning in terms of the Fisher information results in a Hebbian self-limiting learning rule for synaptic plasticity. In the present work, we study the dependence of the solutions to this rule on the moments of the input probability distribution and find a preference for non-Gaussian directions, making it a suitable candidate for independent component analysis (ICA). We confirm in a numerical experiment that a neuron trained under these rules is able to find the independent components in the non-linear bars problem. The specific form of the plasticity rule depends on the transfer function used, becoming a simple cubic polynomial of the membrane potential in the case of the rescaled error function. The cubic learning rule is also an excellent approximation for other transfer functions, such as the standard sigmoidal, and can be used to show analytically that the proposed plasticity rules are selective for directions in the space of presynaptic neural activities characterized by a negative excess kurtosis.
KW - Fisher information
KW - guiding principle
KW - excess kurtosis
KW - objective functions
KW - synaptic plasticity
KW - Hebbian learning
KW - independent component analysis
Y1 - 2015
UR - http://publikationen.ub.uni-frankfurt.de/frontdoor/index/index/docId/40266
UR - https://nbn-resolving.org/urn:nbn:de:hebis:30:3-402660
SN - 1099-4300
N1 - © 2015 by the authors; licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution license (http://creativecommons.org/licenses/by/4.0/).
IS - 6
SP - 3838
EP - 3856
PB - MDPI
CY - Basel
ER -