
Gain elimination from backpropagation neural networks

  • G. Thimm (corresponding author)
  • E. Fiesler
  • P. Moerland

Research output: Contribution to conference › Paper › Academic

Abstract

It is shown that the gain of the sigmoidal activation function, used in backpropagation neural networks, can be eliminated, since there exists a well-defined relationship between the gain, the learning rate, and the set of initial weights. Similarly, it is also possible to eliminate the learning rate by adjusting the gain and the initial weights. The relationship is proven, extended to several variants of the backpropagation learning rule, and applied to hardware implementations of neural networks.
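The relationship stated in the abstract can be illustrated numerically. The sketch below, which is an illustrative reconstruction rather than the paper's own derivation, checks one commonly cited form of the equivalence for a single sigmoid unit: a network with gain β, weights w, and learning rate η behaves like a gain-1 network with weights βw and learning rate β²η. The variable names and the toy data are assumptions for demonstration only.

```python
import math

def sigmoid(x, gain=1.0):
    # Sigmoidal activation with gain: f(x) = 1 / (1 + exp(-gain * x))
    return 1.0 / (1.0 + math.exp(-gain * x))

def train_step(w, x, target, lr, gain):
    # One gradient-descent step for a single sigmoid unit with squared error.
    net = sum(wi * xi for wi, xi in zip(w, x))
    y = sigmoid(net, gain)
    # dE/dw_i = (y - target) * gain * y * (1 - y) * x_i
    delta = (y - target) * gain * y * (1.0 - y)
    return [wi - lr * delta * xi for wi, xi in zip(w, x)], y

beta = 2.5            # gain to be eliminated (illustrative value)
w = [0.4, -0.3]       # initial weights (illustrative values)
x = [1.0, 0.5]
target = 1.0
lr = 0.1

# Network A: gain beta, learning rate lr, initial weights w.
wA, yA = train_step(w, x, target, lr, beta)

# Network B: gain 1, initial weights scaled by beta,
# learning rate scaled by beta**2.
wB, yB = train_step([beta * wi for wi in w], x, target, lr * beta**2, 1.0)

# The two networks produce identical outputs, and their weights remain
# related by the factor beta after the update, so the gain is redundant.
assert abs(yA - yB) < 1e-12
assert all(abs(beta * a - b) < 1e-12 for a, b in zip(wA, wB))
```

Because the equivalence holds before and after each update step, it holds by induction over an entire training run, which is why the gain (or, symmetrically, the learning rate) can be fixed to 1 without loss of generality.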

Original language: English
Pages: 365-368
Number of pages: 4
Publication status: Published - 1995
Event: Proceedings of the 1995 IEEE International Conference on Neural Networks. Part 1 (of 6) - Perth, Australia
Duration: 27 Nov 1995 - 1 Dec 1995

Conference

Conference: Proceedings of the 1995 IEEE International Conference on Neural Networks. Part 1 (of 6)
City: Perth, Australia
Period: 27/11/1995 - 01/12/1995
