Results on the Steepness in Backpropagation Neural Networks
1994
Abstract
The backpropagation algorithm is widely used for training multilayer neural networks. In this publication, the steepness of its activation functions is investigated. Specifically, it is shown that changing the steepness of the activation function is equivalent to changing the learning rate and the weights. Some applications of this result to optical and other hardware implementations of neural networks are given.
Details
Title
Results on the Steepness in Backpropagation Neural Networks
Author(s)
Moerland, Perry ; Thimm, Georg ; Fiesler, Emile
Published in
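The weight part of the claimed equivalence can be illustrated for the forward pass: applying a sigmoid with steepness (gain) β to a weighted sum gives the same output as a unit-gain sigmoid applied to weights scaled by β, layer by layer. The following is a minimal sketch under that reading; the two-layer network, weights, and input values are hypothetical, chosen only for illustration.

```python
import math

def sigmoid(x, beta=1.0):
    """Logistic activation with steepness (gain) beta."""
    return 1.0 / (1.0 + math.exp(-beta * x))

def forward(x, weights, beta):
    """Forward pass through a one-hidden-layer network.

    weights is a pair (W1, W2) of weight matrices given as nested lists;
    every neuron uses a sigmoid with the same steepness beta.
    """
    W1, W2 = weights
    hidden = [sigmoid(sum(w * xi for w, xi in zip(row, x)), beta) for row in W1]
    return [sigmoid(sum(w * h for w, h in zip(row, hidden)), beta) for row in W2]

# Hypothetical input and weights for a 2-2-1 network.
x = [0.3, -0.7]
W1 = [[0.5, -0.2], [0.1, 0.9]]
W2 = [[0.4, -0.6]]
beta = 2.5

# Steepness beta with weights W matches steepness 1 with weights beta * W,
# since sigmoid(beta * (w . x)) == sigmoid((beta * w) . x) for each neuron.
scaled = ([[beta * w for w in row] for row in W1],
          [[beta * w for w in row] for row in W2])
y_steep = forward(x, (W1, W2), beta)
y_scaled = forward(x, scaled, 1.0)
```

Here `y_steep` and `y_scaled` agree up to floating-point rounding. During training the picture involves the learning rate as well, as the abstract states, since rescaling the weights also rescales the gradients; only the forward-pass equivalence is demonstrated in this sketch.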
Proceedings of the '94 SIPAR-Workshop on Parallel and Distributed Computing
Editor(s)
Pages
91-94
Presented at
SI Group for Parallel Systems - Proceedings of the '94 SIPAR-Workshop on Parallel and Distributed Computing
Date
1994
Publisher
Institute of Informatics, University Pérolles, Fribourg, Switzerland
Keywords (free)
slope; (adaptive) learning rate; neuron; bias; optical implementation; gain; learning; neurocomputing; multilayer neural network; neural network; adaptive steepness; (sigmoid) steepness; neural computation; connectionism; backpropagation; neural computing; initial weight; activation function.
Laboratories
LIDIAP
The document appears in
Scientific production and competences > STI - Faculté des sciences et techniques de l'ingénieur > IEM - Institute of Electrical and Micro Engineering > LIDIAP - Laboratoire de l'IDIAP
Scientific production and competences > Euler Center for Signal Processing
Conference papers
Work produced at EPFL
Published
Record creation date
2006-03-10