Linear threshold Boolean units (LTUs) are the basic processing components of artificial neural networks with Boolean activations. Quantization of their parameters is a central question for hardware implementation, when numerical technologies are used to store the configuration of the circuit. In previous studies of the circuit complexity of feedforward neural networks, no distinction was made between a network with ``small'' integer weights and one composed of majority units (LTUs with weights in {-1, 0, 1}), since any connection of integer weight w can be simulated by |w| connections of weight sgn(w). This paper focuses on the circuit complexity of democratic networks, i.e. circuits of majority units with at most one connection between each pair of units. The main results are the following: any Boolean function can be computed by a depth-3 non-degenerate democratic network and can be expressed as a linear threshold function of majorities; AT-LEAST-k and AT-MOST-k are computable by depth-2, polynomial-size democratic networks; the smallest size of a depth-2 circuit computing PARITY is the same for a democratic network as for a usual network; the VC dimension of the class of majority functions is n + 1, i.e. equal to that of the class of all linear threshold functions.
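A minimal sketch (not from the paper, names are illustrative) of the classical simulation argument mentioned above: an integer-weight LTU can be emulated by a unit whose weights all lie in {-1, 0, +1}, by replacing each connection of weight w with |w| parallel connections of weight sgn(w). Democratic networks forbid exactly this duplication, since at most one connection is allowed between each pair of units.

```python
def ltu(weights, threshold, x):
    """Linear threshold unit: output 1 iff the weighted sum reaches the threshold."""
    s = sum(w * xi for w, xi in zip(weights, x))
    return 1 if s >= threshold else 0

def simulate_with_unit_weights(weights, threshold, x):
    """Simulate the same LTU using only weights in {-1, 0, +1}:
    each input x_i is fed through |w_i| parallel connections of weight sgn(w_i)."""
    expanded_weights, expanded_inputs = [], []
    for w, xi in zip(weights, x):
        sign = 1 if w > 0 else -1
        for _ in range(abs(w)):          # |w| copies of the connection
            expanded_weights.append(sign)
            expanded_inputs.append(xi)
    return ltu(expanded_weights, threshold, expanded_inputs)

if __name__ == "__main__":
    weights, threshold = [3, -2, 1], 2
    for x in [(1, 0, 1), (1, 1, 0), (0, 1, 1), (1, 1, 1)]:
        assert ltu(weights, threshold, x) == simulate_with_unit_weights(weights, threshold, x)
    print("Integer-weight LTU and its {-1, 0, +1} simulation agree on the test inputs.")
```

The duplication makes the two models interchangeable in earlier complexity analyses; the paper's point is that once multiple connections between a pair of units are disallowed, the question must be revisited.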