Results on the Steepness in Backpropagation Neural Networks
1994
Abstract
The backpropagation algorithm is widely used for training multilayer neural networks. In this publication, the steepness of its activation functions is investigated. Specifically, it is shown that changing the steepness of the activation function is equivalent to changing the learning rate and the weights. Some applications of this result to optical and other hardware implementations of neural networks are given.
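The equivalence stated in the abstract can be checked numerically for a single sigmoid neuron: a network trained with steepness beta, learning rate eta, and weights w behaves identically to a network with unit steepness, weights scaled by beta, and learning rate scaled by beta squared. The sketch below is illustrative only (the variable names and the single-neuron setup are not from the paper); it verifies that the weight correspondence survives one gradient-descent step on a squared-error loss.

```python
import numpy as np

def sigmoid(z):
    # logistic activation; steepness is applied by scaling the argument
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(0)
x = rng.normal(size=4)   # a single input pattern (hypothetical data)
t = 0.7                  # target output
beta, eta = 2.5, 0.1     # steepness and learning rate for network A

w = rng.normal(size=4)   # network A: steepness beta, learning rate eta
v = beta * w             # network B: steepness 1, weights scaled by beta

# Forward passes agree: sigmoid(beta * w.x) == sigmoid((beta*w).x)
ya = sigmoid(beta * (w @ x))
yb = sigmoid(v @ x)
assert np.isclose(ya, yb)

# One gradient-descent step on E = 0.5 * (y - t)**2,
# using sigma'(z) = sigma(z) * (1 - sigma(z)).
ga = (ya - t) * ya * (1 - ya) * beta * x   # dE/dw for network A
gb = (yb - t) * yb * (1 - yb) * x          # dE/dv for network B
w_new = w - eta * ga
v_new = v - (beta**2 * eta) * gb           # network B uses rate beta^2 * eta

# The correspondence v = beta * w is preserved after the update,
# so the two networks remain equivalent throughout training.
assert np.allclose(v_new, beta * w_new)
```

The key point is the learning-rate factor of beta squared: one beta comes from the chain rule through the scaled activation argument, and a second from mapping the update back onto the scaled weights.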
Details
Title
Results on the Steepness in Backpropagation Neural Networks
Author(s)
Moerland, Perry; Thimm, Georg; Fiesler, Emile
Published in
Proceedings of the '94 SIPAR-Workshop on Parallel and Distributed Computing
Pages
91-94
Conference
SI Group for Parallel Systems - Proceedings of the '94 SIPAR-Workshop on Parallel and Distributed Computing
Date
1994
Publisher
Institute of Informatics, University Pérolles, Fribourg, Switzerland
Keywords
slope; (adaptive) learning rate; neuron; bias; optical implementation; gain; learning; neurocomputing; multilayer neural network; neural network; adaptive steepness; (sigmoid) steepness; neural computation; connectionism; backpropagation; neural computing; initial weight; activation function
Laboratories
LIDIAP
Record Appears in
Scientific production and competences > STI - School of Engineering > IEM - Institut d'Electricité et de Microtechnique > LIDIAP - L'IDIAP Laboratory
Scientific production and competences > Euler Center for Signal Processing
Conference Papers
Work produced at EPFL
Published
Record creation date
2006-03-10