For my project work I used an Elman neural network with the resilient backpropagation algorithm and the Nguyen-Widrow algorithm for generating initial layer weights. I observed a large difference between outputs in different trials.

For my project work I used an Elman neural network (ENN) with the resilient backpropagation algorithm and the Nguyen-Widrow algorithm for generating initial layer weights. I observed a large difference between outputs across trials: the first time I trained the network it gave 94% accuracy, but the second time, with the same inputs and targets, I got only 64%. After training the first time I didn't save the network. Please suggest ways to avoid this difference between consecutive trials. I am using MATLAB 2010; I created the ENN using nntool and then switched it to 'trainrp' in code, because creating the ENN with 'trainrp' directly gave me an error.

Accepted Answer

Walter Roberson
Walter Roberson on 25 Mar 2014
Weights in neural networks are initialized randomly unless you initialize them manually. If you are getting as large a difference as you are seeing, that should suggest your network is not robust.
You can control the random number seed to reproduce particular networks. In R2010 you should see the documentation for the RandStream class; R2011a or so introduced rng() as a simpler way to set the random seed.
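A minimal sketch of seeding the random number generator before initializing a network, using the RandStream class available in R2010 (rng() would be the one-line equivalent in R2011a and later). The seed value 0 and the 'mt19937ar' generator are illustrative choices, not requirements:

```matlab
% Fix the random seed so weight initialization is reproducible (R2010 style).
s = RandStream('mt19937ar', 'Seed', 0);
RandStream.setDefaultStream(s);   % renamed setGlobalStream in later releases

% From R2011a onward, the simpler equivalent is:
% rng(0);

% Re-initialize the network *after* seeding; init() draws its random initial
% weights from the seeded stream, so repeated runs start from identical weights.
net = init(net);
```

With the seed fixed, consecutive training runs on the same data start from the same initial weights, so the trial-to-trial variation described in the question disappears (up to any other sources of randomness, such as random data division).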
  1 comment
Balaji
Balaji on 25 Mar 2014
I used the Nguyen-Widrow algorithm to initialize the weights, with net = initnw(net,layer); but the results still showed great variation.


More Answers (1)

Greg Heath
Greg Heath on 26 Mar 2014
Search for one of my design examples using queries such as:
greg net rng(0)
greg net rng(4151941)
greg net rng(default)
As Walter has mentioned, there are probably more up-to-date ways of initializing the RNG.
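Building on the rng() searches above, here is a hedged sketch of the kind of multi-seed design loop those examples use: train several networks from different seeds and keep the best, since any single random initialization can land in a poor local minimum. The variables x and t stand for your own inputs and targets, and the hidden-layer size of 10 is an assumed placeholder:

```matlab
% Assumed workflow: try several seeds, keep the best-performing network.
bestPerf = Inf;
for seed = [0 1 2 3 4]
    rng(seed);                      % R2011a+; use RandStream in R2010
    net = newelm(x, t, 10);         % Elman network, 10 hidden neurons (assumed)
    net.trainFcn = 'trainrp';       % resilient backpropagation, as in the question
    [net, tr] = train(net, x, t);
    if tr.best_perf < bestPerf     % tr records training performance
        bestPerf = tr.best_perf;
        bestNet  = net;
        bestSeed = seed;           % remember the seed so the net is reproducible
    end
end
```

Recording the winning seed means the best network can be regenerated later even if, as in the original question, the trained network itself was not saved.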
