Title: Results on the Steepness in Backpropagation Neural Networks
Authors: Moerland, Perry; Thimm, Georg; Fiesler, Emile
Year: 1994
Record date: 2006-03-10
URL: https://infoscience.epfl.ch/handle/20.500.14299/227538
Document type: Conference paper

Abstract: The backpropagation algorithm is widely used for training multilayer neural networks. In this publication, the steepness of its activation functions is investigated. Specifically, it is shown that changing the steepness of the activation function is equivalent to changing the learning rate and the weights. Some applications of this result to optical and other hardware implementations of neural networks are given.

Keywords: slope; (adaptive) learning rate; neuron; bias; optical implementation; gain; learning; neurocomputing; multilayer neural network; neural network; adaptive steepness; (sigmoid) steepness; neural computation; connectionism; backpropagation; neural computing; initial weight; activation function
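
The equivalence claimed in the abstract can be made concrete for a sigmoid with gain (steepness) beta: since sigmoid(beta * w.x) = sigmoid((beta*w).x), a network of steepness beta and weights w computes the same outputs as a unit-steepness network with weights beta*w, and a gradient-descent step with learning rate eta in the former corresponds to a step with learning rate beta^2 * eta in the latter. The Python sketch below checks this for a single neuron trained on squared error; the setup, data, and the train_step helper are illustrative assumptions, not code or notation from the paper.

    import numpy as np

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    def train_step(w, x, t, eta, beta):
        # One gradient-descent step for a single sigmoid neuron with
        # output y = sigmoid(beta * w.x) and error E = (y - t)^2 / 2.
        net = w @ x
        y = sigmoid(beta * net)
        grad = (y - t) * y * (1.0 - y) * beta * x   # dE/dw
        return w - eta * grad

    rng = np.random.default_rng(0)
    x = rng.normal(size=3)   # hypothetical input pattern
    w = rng.normal(size=3)   # initial weights
    t, eta, beta = 0.7, 0.5, 2.0

    # Network A: steepness beta, learning rate eta, weights w.
    wA = train_step(w, x, t, eta, beta)

    # Network B: steepness 1, weights beta*w, learning rate beta^2 * eta.
    wB = train_step(beta * w, x, t, beta**2 * eta, 1.0)

    # Equivalence: B's weights remain beta times A's weights after the step.
    print(np.allclose(beta * wA, wB))   # True

Running this prints True: after the update, the unit-steepness network's weights are still exactly beta times those of the steepness-beta network, so the two training trajectories coincide, which is the sense in which steepness can be traded for learning rate and weights.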