Results on the Steepness in Backpropagation Neural Networks
The backpropagation algorithm is widely used for training multilayer neural networks. In this publication, the steepness of its activation functions is investigated. Specifically, it is shown that changing the steepness of the activation function is equivalent to changing the learning rate and the weights. Some applications of this result to optical and other hardware implementations of neural networks are given.
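The stated equivalence can be illustrated with a minimal sketch (an illustration under assumptions, not code from the publication): for a single sigmoid neuron trained by gradient descent on a squared error, a steepness (gain) of β with learning rate η behaves identically to a standard-steepness neuron whose weights are scaled by β and whose learning rate is scaled by β².

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def step(w, x, t, lr, beta):
    """One gradient-descent step for a single sigmoid neuron
    y = sigmoid(beta * w.x) with squared error 0.5 * (y - t)**2."""
    y = sigmoid(beta * np.dot(w, x))
    grad = (y - t) * y * (1.0 - y) * beta * x  # dL/dw via the chain rule
    return w - lr * grad, y

rng = np.random.default_rng(0)
x = rng.normal(size=3)
w = rng.normal(size=3)
t, lr, beta = 0.7, 0.5, 2.5  # arbitrary illustrative values

# Network A: steepness beta, weights w, learning rate lr.
w_new, y_a = step(w, x, t, lr, beta)

# Network B: steepness 1, weights beta*w, learning rate beta**2 * lr.
v_new, y_b = step(beta * w, x, t, beta**2 * lr, 1.0)

print(np.allclose(y_a, y_b))             # same output before the step
print(np.allclose(v_new, beta * w_new))  # weight updates remain equivalent
```

After the update, network B's weights are still exactly β times network A's, so the two networks stay equivalent throughout training; this is why a hardware-fixed gain can be compensated by rescaling the initial weights and learning rate.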
Keywords: slope ; (adaptive) learning rate ; neuron ; bias ; optical implementation ; gain ; learning ; neurocomputing ; multilayer neural network ; neural network ; adaptive steepness ; sigmoid steepness ; neural computation ; connectionism ; backpropagation ; neural computing ; initial weight ; activation function.
Record created on 2006-03-10, modified on 2016-08-08