
Bearded-android-docs

How to Choose the Momentum and Learning Parameters


Created Saturday 07 December 2013

Unfortunately, networks cannot learn with arbitrary learning and momentum parameters. When training fails, you might suspect the implementation is broken when you are really just using the wrong parameters, so it pays to develop an intuitive understanding of them:

In general, you’ll find that difficult problems require a smaller learning parameter (at the cost of more iterations of the backpropagation algorithm). The momentum keeps the network from getting stuck, but if it’s too large then it also keeps the network from learning.
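To make this concrete, here is a minimal sketch (not from the original page) of the classic momentum update on a toy one-dimensional objective. The function names and the toy objective are illustrative assumptions; the update rule itself is the standard one: a velocity term accumulates past gradients, scaled down by the learning parameter.

```python
def minimize(grad, w0, lr, momentum, steps):
    """Gradient descent with momentum on a scalar parameter.

    grad: gradient function of the objective
    lr: learning parameter; momentum: momentum parameter (illustrative names)
    """
    w, v = w0, 0.0
    for _ in range(steps):
        v = momentum * v - lr * grad(w)  # blend old velocity with new gradient
        w = w + v
    return w

# Toy objective f(w) = (w - 3)^2, so grad f = 2*(w - 3); minimum at w = 3.
grad = lambda w: 2.0 * (w - 3.0)

good = minimize(grad, w0=0.0, lr=0.1, momentum=0.5, steps=100)  # settles near 3
bad = minimize(grad, w0=0.0, lr=1.5, momentum=0.0, steps=100)   # lr too large: diverges
```

Rerunning `minimize` with different `lr` and `momentum` values is a quick way to see the trade-off described above: a smaller learning parameter converges reliably but needs more steps, while an overly large one (or too much momentum) never settles at all.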

Note that not all versions of the backpropagation algorithm are created equal. The classical :NeuralNetworks:BackPropagation:BackPropAlgorithms:BatchModeBackPropagation algorithm is very sensitive to these parameters, while other versions such as the :NeuralNetworks:BackPropagation:StochasticGradientBackPropagation are more robust.
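The difference between the two update schedules can be sketched as follows (an assumed toy setup, not code from this project): batch mode applies one update per epoch using the gradient averaged over the whole training set, while stochastic mode updates after every sample.

```python
# Toy data for fitting y = w * x by squared error; true w = 2.
data = [(1.0, 2.0), (2.0, 4.0), (3.0, 6.0)]

def batch_train(w, lr, epochs):
    # Batch mode: one update per epoch from the averaged gradient.
    for _ in range(epochs):
        g = sum(2 * (w * x - y) * x for x, y in data) / len(data)
        w -= lr * g
    return w

def stochastic_train(w, lr, epochs):
    # Stochastic mode: one update per sample; steps are noisier,
    # but each one uses fresh gradient information.
    for _ in range(epochs):
        for x, y in data:
            w -= lr * 2 * (w * x - y) * x
    return w

w_batch = batch_train(0.0, lr=0.05, epochs=100)       # near 2
w_sgd = stochastic_train(0.0, lr=0.05, epochs=100)    # near 2
```

Both reach the minimum here, but the per-sample updates let you probe how each schedule reacts as you push the learning parameter toward the unstable regime.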

