TY - UNPD
A1 - Brause, Rüdiger W.
T1 - Approximator networks and the principle of optimal information distribution
T2 - Universität Frankfurt am Main. Fachbereich Informatik: Interner Bericht ; 91,1
N2 - It is well known that artificial neural nets can be used as approximators of any continuous function to any desired degree. Nevertheless, for a given application and a given network architecture there remains the non-trivial task of determining the necessary number of neurons and the necessary accuracy (number of bits) per weight for satisfactory operation. In this paper the problem is treated by an information-theoretic approach. The values for the weights and thresholds in the approximator network are determined analytically. Furthermore, the accuracy of the weights and the number of neurons are seen as general system parameters which determine the maximal output information (i.e. the approximation error) by the absolute amount and the relative distribution of information contained in the network. A new principle of optimal information distribution is proposed and the conditions for the optimal system parameters are derived. For the simple, instructive example of a linear approximation of a non-linear, quadratic function, the principle of optimal information distribution gives the optimal system parameters, i.e. the number of neurons and the different resolutions of the variables.
T3 - Interner Bericht / Fachbereich Informatik, Johann Wolfgang Goethe-Universität Frankfurt a.M. - 91,1
Y1 - 1991
UR - http://publikationen.ub.uni-frankfurt.de/frontdoor/index/index/docId/7949
UR - https://nbn-resolving.org/urn:nbn:de:hebis:30-79017
ER -