Views expressed in these public forums are not endorsed by
Drexel University or The Math Forum.



Re: Neural Network performance normalization
Posted:
Feb 7, 2013 7:02 AM


"EanX" wrote in message <ket9j8$17c$1@newscl01ah.mathworks.com>... > In designig a MLP NN I noticed that, for the inputs that I'm considering, I obtain a Performance value of 1.05e3 and this value become greater as inputs size increase, so I have to set > > net.performParam.normalization='percent'; > > in order to have a performance value in range [0 1]. > Is this correct?
I don't know what you mean by performance value.
I recommend ALWAYS standardizing your inputs:

[ xtrn, mu, sig ] = zscore(Xtrn'); % zscore standardizes each column
xtrn = xtrn';
xval = (Xval - mu')./sig'; % reuse the TRAINING statistics (use bsxfun before R2016b)
xtst = (Xtst - mu')./sig';

AND standardizing your outputs if they are not binary.
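The same standardization can be sketched in Python/NumPy (the data shapes and values here are hypothetical, just to show that validation and test sets are scaled with the training set's mean and standard deviation, not their own):

```python
import numpy as np

# Hypothetical data: rows are features, columns are cases (MATLAB-style layout)
rng = np.random.default_rng(0)
Xtrn = rng.normal(5.0, 2.0, size=(3, 100))   # training inputs
Xval = rng.normal(5.0, 2.0, size=(3, 20))    # validation inputs
Xtst = rng.normal(5.0, 2.0, size=(3, 20))    # test inputs

# Statistics of the TRAINING set only, per feature
mu  = Xtrn.mean(axis=1, keepdims=True)
sig = Xtrn.std(axis=1, keepdims=True)

# Apply the training statistics to all three sets
xtrn = (Xtrn - mu) / sig
xval = (Xval - mu) / sig
xtst = (Xtst - mu) / sig

# The training set now has zero mean and unit variance per feature;
# xval and xtst are close to, but not exactly, standardized.
```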
If your current "performance value" is MSE , then normalize it with MSE00 = mean(var(T,1)) to get
NMSE = MSE/MSE00 R2 = 1MSE/MSE00
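A small NumPy sketch of the same normalization (the targets and outputs here are made up; MSE00 is the MSE of the naive model that always predicts the target mean, so NMSE < 1 means the net beats that baseline):

```python
import numpy as np

# Hypothetical targets T and net outputs Y, features x cases
rng = np.random.default_rng(1)
T = rng.normal(size=(2, 50))
Y = T + 0.1 * rng.normal(size=T.shape)   # a good but imperfect model

MSE = np.mean((T - Y) ** 2)

# Biased (1/N) per-target variance over cases, averaged over targets;
# this is the MSE of the constant model Y = mean(T)
MSE00 = np.mean(np.var(T, axis=1))       # np.var uses ddof=0 by default

NMSE = MSE / MSE00
R2 = 1 - NMSE
```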
See the recent post on ANN_Error Goal
Hope this helps.
Greg



