Training NN keeps giving same answer
Posted:
Dec 11, 2012 4:31 PM


I have a simple NN classifier (inside a parfor loop) and a function that runs it five times and reports the percentage of correct classifications (and the area under the ROC curve) for each of the five networks.
The problem is that every time I run this program, I get the same five numbers (e.g., 55.55, 60, 78.22), no matter how many times I rerun it.
net.layers{ii}.initFcn is set to 'initnw' for ii=1,2. I am using backprop on a net built with 'patternnet'.
The weird thing is that the same code in the loop produces five different answers, but it is the same five answers, in the same order, on every run of the program. The answers are reasonable, incidentally, but where is the randomness?
Do I need to reseed the random number generator somehow with each run? How can I do that for NN training?
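In case it helps anyone else: one likely explanation is that 'initnw' (Nguyen-Widrow initialization) draws its initial weights from MATLAB's global random stream, which starts from the same default state in every fresh session, so each run replays the same five initializations in the same order. A minimal sketch of a fix, assuming MATLAB R2011a or later (where `rng` is available) and a made-up hidden layer size of 10; the inputs `x` and targets `t` are placeholders:

```matlab
% Reseed the global RNG from the clock once per session/run, so each
% run of the program starts from a different random state.
rng('shuffle');

for ii = 1:5
    net = patternnet(10);        % hidden layer size is an assumption
    net = configure(net, x, t);
    net = init(net);             % re-draws initial weights via initnw
    [net, tr] = train(net, x, t);
    % ... evaluate percent correct / AUC here ...
end
```

Calling `rng(someInteger)` instead of `rng('shuffle')` gives the opposite behavior (reproducible runs), which is often what you want for debugging. Note that inside a `parfor` loop each worker has its own random stream, so reseeding on the client alone may not affect what the workers draw.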
Thanks for any help.



