The error-bounded descriptional complexity of approximation networks

  • It is well known that artificial neural nets can approximate any continuous function to any desired degree of accuracy and can therefore be used, e.g., in high-speed, real-time process control. Nevertheless, for a given application and a given network architecture, the non-trivial task remains of determining the necessary number of neurons and the necessary accuracy (number of bits) per weight for satisfactory operation; these are critical issues in VLSI and computer implementations of non-trivial tasks. In this paper, the accuracy of the weights and the number of neurons are treated as general system parameters which determine the maximal approximation error through the absolute amount and the relative distribution of information contained in the network. We define the error-bounded network descriptional complexity as the minimal number of bits for a class of approximation networks which show a certain approximation error, and we achieve the conditions for this goal by the new principle of optimal information distribution. For two examples, a simple linear approximation of a non-linear, quadratic function and a non-linear approximation of the inverse kinematic transformation used in robot manipulator control, the principle of optimal information distribution yields the optimal number of neurons and the resolutions of the variables, i.e. the minimal amount of storage for the neural net. Keywords: Kolmogorov complexity, ε-entropy, rate-distortion theory, approximation networks, information distribution, weight resolutions, Kohonen mapping, robot control.
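The trade-off the abstract describes, splitting a fixed bit budget between the number of neurons and the resolution per weight, can be illustrated with a minimal sketch. This is not the paper's method, only a hedged toy example: a piecewise-linear "network" approximates the quadratic function f(x) = x² on [0, 1], with its stored values uniformly quantized to a given number of bits, so the maximal error as a function of the (neurons, bits-per-weight) split can be compared at equal total storage.

```python
import numpy as np

def quantize(values, bits):
    """Uniformly quantize values in [0, 1] to the given number of bits."""
    levels = 2 ** bits - 1
    return np.round(values * levels) / levels

def max_error(n_neurons, bits, f=lambda x: x ** 2):
    """Maximal approximation error of a piecewise-linear approximator
    with n_neurons knots whose stored outputs use `bits` bits each."""
    knots = np.linspace(0.0, 1.0, n_neurons)
    weights = quantize(f(knots), bits)        # quantized stored values
    x = np.linspace(0.0, 1.0, 10_001)
    approx = np.interp(x, knots, weights)     # linear interpolation
    return float(np.max(np.abs(approx - f(x))))

# Same total budget of 64 bits, distributed differently:
for n, b in [(4, 16), (8, 8), (16, 4), (32, 2)]:
    print(f"{n:2d} neurons x {b:2d} bits = {n * b} bits total, "
          f"max error = {max_error(n, b):.4f}")
```

Neither extreme wins: too few neurons leaves a large interpolation error no matter how precise the weights are, while too few bits per weight leaves a large quantization error no matter how many neurons are used. Finding the optimal split for a prescribed error bound is what the paper's principle of optimal information distribution addresses.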

Metadata
Author:Rüdiger W. Brause
URN:urn:nbn:de:hebis:30-79022
ISSN:1879-2782
ISSN:0893-6080
Document Type:Article
Language:English
Date of Publication (online):2010/09/08
Year of first Publication:1993
Publishing Institution:Universitätsbibliothek Johann Christian Senckenberg
Release Date:2010/09/08
Tag:Kohonen mapping; Kolmogorov complexity; approximation networks; ε-entropy; information distribution; rate-distortion theory; weight resolutions
Page Number:33
First Page:2
Last Page:29
Note:
first published in: Neural Networks, 6 (1993), pp. 177-187
Source:Neural Networks, 6, pp. 177-187
HeBIS-PPN:227571827
Institutes:Computer Science and Mathematics / Computer Science
Dewey Decimal Classification:0 Computer science, information & general works / 00 Computer science, knowledge & systems / 000 Computer science, information & general works
Collections:University publications
Licence (German):Deutsches Urheberrecht (German copyright law)