Conference paper

Results on the Steepness in Backpropagation Neural Networks

Moerland, Perry • Thimm, Georg • Fiesler, Emile • Aguilar, Marc
1994
Published in: Proceedings of the '94 SIPAR-Workshop on Parallel and Distributed Computing (SI Group for Parallel Systems)

The backpropagation algorithm is widely used for training multilayer neural networks. In this publication, the steepness of its activation functions is investigated. Specifically, it is shown that changing the steepness of the activation function is equivalent to changing the learning rate and the weights. Some applications of this result to optical and other hardware implementations of neural networks are given.
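The equivalence stated in the abstract can be illustrated for a single sigmoid neuron: since sigmoid(βx) with weights w computes the same output as sigmoid(x) with weights βw, a network with steepness β, weights w, and learning rate η follows the same gradient-descent trajectory as a standard (β = 1) network with weights βw and learning rate β²η. The NumPy sketch below checks this correspondence for one training step; the function names and the one-step setup are illustrative, not taken from the paper.

    import numpy as np

    def sigmoid(x, beta=1.0):
        # Logistic activation with steepness (gain) parameter beta.
        return 1.0 / (1.0 + np.exp(-beta * x))

    def train_step(w, x, t, lr, beta):
        # One gradient-descent step on squared error for a single neuron.
        y = sigmoid(w @ x, beta)
        # d/dw of 0.5*(y - t)^2 with y = sigmoid(beta * w.x):
        grad = (y - t) * beta * y * (1.0 - y) * x
        return w - lr * grad

    rng = np.random.default_rng(0)
    x = rng.normal(size=3)   # one input pattern (illustrative)
    t = 0.7                  # target output (illustrative)
    w = rng.normal(size=3)
    beta, lr = 2.5, 0.1

    # Network A: steepness beta, weights w, learning rate lr.
    w_a = train_step(w, x, t, lr, beta)

    # Network B: steepness 1, weights beta*w, learning rate beta^2 * lr.
    w_b = train_step(beta * w, x, t, beta**2 * lr, beta=1.0)

    # Equivalence: B's weights remain beta times A's weights after the update.
    assert np.allclose(beta * w_a, w_b)
    print("beta * w_a =", beta * w_a)
    print("w_b        =", w_b)

The assertion holds because network B sees the same pre-activation (βw)·x as network A, so its gradient is A's gradient divided by β; scaling the learning rate by β² then keeps the two weight vectors in the ratio β throughout training. This single-neuron check only illustrates the kind of equivalence the abstract describes; the paper treats the multilayer case.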
