Results on the Steepness in Backpropagation Neural Networks

The backpropagation algorithm is widely used for training multilayer neural networks. This publication investigates the steepness of its activation functions. Specifically, it shows that changing the steepness of the activation function is equivalent to rescaling the learning rate and the weights. Some applications of this result to optical and other hardware implementations of neural networks are given.
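
For intuition, the following is a minimal numerical sketch, not taken from the paper, of one concrete form such an equivalence can take for a single sigmoid unit trained with squared error: a unit with steepness beta, weights w, and learning rate eta follows the same gradient-descent trajectory as a steepness-1 unit with weights beta*w and learning rate beta**2 * eta. The values of beta and eta, and the restriction to a single unit, are assumptions made here for illustration; the paper treats general multilayer networks.

import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(0)
x = rng.normal(size=(5, 3))   # 5 samples, 3 inputs (illustrative data)
t = rng.uniform(size=5)       # targets in (0, 1)

beta, eta = 2.5, 0.1          # steepness and learning rate (assumed values)
w = rng.normal(size=3)        # initial weights of the steepness-beta unit
w_tilde = beta * w            # rescaled initial weights of the steepness-1 unit

for _ in range(100):
    # Steepness-beta unit, learning rate eta
    y = sigmoid(beta * (x @ w))
    grad = ((y - t) * y * (1.0 - y)) @ x * beta   # d(loss)/dw, sigmoid' = y(1-y)
    w = w - eta * grad

    # Steepness-1 unit, learning rate beta**2 * eta
    y2 = sigmoid(x @ w_tilde)
    grad2 = ((y2 - t) * y2 * (1.0 - y2)) @ x
    w_tilde = w_tilde - (beta ** 2) * eta * grad2

# The two trajectories coincide up to the scale factor beta at every step.
print(np.allclose(w_tilde, beta * w))   # expected: True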


Editor(s):
Aguilar, Marc
Published in:
Proceedings of the '94 SIPAR-Workshop on Parallel and Distributed Computing, 91-94
Presented at:
SI Group for Parallel Systems - Proceedings of the '94 SIPAR-Workshop on Parallel and Distributed Computing
Year:
1994
Publisher:
Institute of Informatics, University Pérolles, Fribourg, Switzerland



