Capacity of two-layer feedforward neural networks with binary weights

Lower and upper bounds on the information capacity of two-layer feedforward neural networks with binary interconnections, integer thresholds for the hidden units, and zero threshold for the output unit are obtained in two steps. First, through a constructive approach based on statistical analysis, it is shown that a specifically constructed (N-2L-1) network with N input units, 2L hidden units, and one output unit can implement, with probability almost one, any dichotomy of O(W/ln W) random samples drawn from some continuous distributions, where W is the total number of weights of the network. This quantity then serves as a lower bound on the information capacity C of all (N-2L-1) networks with binary weights. Second, an upper bound of O(W) is obtained by a simple counting argument: a network with W binary weights can realize at most 2^W distinct functions, so it cannot shatter more than W samples. Therefore, Ω(W/ln W) ≤ C ≤ O(W).
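The following is a minimal illustrative sketch (not taken from the paper) of the network architecture the abstract describes: an (N-2L-1) feedforward network with binary (±1) weights, integer thresholds on the 2L hidden units, and a zero threshold on the single output unit. The random-weight construction below is only for demonstration; the paper's lower bound relies on a specific constructive choice of weights.

```python
import numpy as np

def make_network(N, L, rng):
    """Draw an illustrative (N-2L-1) network with binary weights.

    Assumed, hypothetical helper: weights are sampled at random here,
    whereas the paper constructs them explicitly.
    """
    W1 = rng.choice([-1, 1], size=(2 * L, N))     # binary input-to-hidden weights
    theta = rng.integers(-N, N + 1, size=2 * L)   # integer hidden thresholds
    w2 = rng.choice([-1, 1], size=2 * L)          # binary hidden-to-output weights
    return W1, theta, w2

def forward(x, W1, theta, w2):
    """Compute the dichotomy label in {0, 1}; output threshold is fixed at zero."""
    hidden = (W1 @ x >= theta).astype(int)        # hard-threshold hidden units
    return int(w2 @ hidden >= 0)

rng = np.random.default_rng(0)
N, L = 10, 4
W1, theta, w2 = make_network(N, L, rng)
x = rng.standard_normal(N)                        # one random continuous-valued input
print(forward(x, W1, theta, w2))
```

In this parameterization the total number of weights is W = 2LN + 2L (first-layer plus output weights), which is the quantity appearing in the Ω(W/ln W) and O(W) bounds.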


Published in:
IEEE Transactions on Information Theory, 44, 256-268
Year:
1998