Bounds on the Degree of High Order Binary Perceptrons
High order perceptrons are often used to reduce the size of neural networks. The complexity of the architecture of a usual multilayer network is then shifted into the complexity of the functions performed by each high order unit and, in particular, into the degree of their polynomials. The main result of this paper provides a bound on the degree of the polynomial of a high order perceptron when the binary training data result from the encoding of an arrangement of hyperplanes in Euclidean space. Such a situation occurs naturally in the case of a feedforward network with a single hidden layer of first order perceptrons and an output layer of high order perceptrons. In this case, the result says that the degree of the high order perceptrons can be bounded by the minimum of the number of inputs and the number of hidden units.
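As an illustrative sketch (not taken from the paper), the following hypothetical Python snippet shows what a high order perceptron is: a unit that thresholds a polynomial of its binary inputs rather than a linear form. The classic example is XOR, which no first order (degree-1) perceptron can compute, but a degree-2 polynomial threshold unit can; the degree 2 used here is consistent with the paper's bound, since the minimum of the number of inputs and the number of hidden units is at least 2 in any such realization.

```python
# Hypothetical illustration: a high order perceptron thresholds a
# polynomial of its binary inputs. Monomials are given as index tuples,
# e.g. {(0,): 1.0, (0, 1): -2.0} encodes x0 - 2*x0*x1.

def high_order_perceptron(weights, bias):
    """Return a classifier x -> 1 if bias + sum of weighted monomials > 0."""
    def classify(x):
        s = bias
        for idx, w in weights.items():
            m = 1
            for i in idx:
                m *= x[i]  # product of the inputs in this monomial
            s += w * m
        return 1 if s > 0 else 0
    return classify

# A degree-2 polynomial realizing XOR on {0,1}^2:
#   x0 + x1 - 2*x0*x1 - 0.5 > 0   iff   x0 != x1
xor = high_order_perceptron({(0,): 1.0, (1,): 1.0, (0, 1): -2.0}, -0.5)
print([xor((a, b)) for a in (0, 1) for b in (0, 1)])  # [0, 1, 1, 0]
```

No single hyperplane separates the four XOR points, so the degree of the polynomial is where the extra expressive power comes from; bounding that degree, as the paper does, bounds the complexity of each such unit.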
RRR-44-95.pdf