Date: Feb 7, 2013 7:02 AM
Author: Greg Heath
Subject: Re: Neural Network performance normalization

"EanX" wrote in message <ket9j8$17c$1@newscl01ah.mathworks.com>...
> In designing an MLP NN I noticed that, for the inputs that I'm considering, I obtain a performance value of 1.05e3, and this value becomes greater as the input size increases, so I have to set
>
> net.performParam.normalization='percent';
>
> in order to have a performance value in range [0 1].
> Is this correct?


I don't know what you mean by performance value.

I recommend ALWAYS standardizing your inputs

[ xtrn, mu, sig ] = zscore(Xtrn');   % zscore works columnwise, so transpose (and avoid shadowing std)
xtrn = xtrn';
xval = (Xval - mu')./sig';           % reuse the training-set statistics
xtst = (Xtst - mu')./sig';

AND standardizing your outputs if they are not binary.
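For example, a minimal sketch using the toolbox function mapstd, which works row-wise and so matches the columns-are-samples convention above (Ttrn/Tval/Ttst are assumed names for your target matrices):

[ ttrn, ts ] = mapstd(Ttrn);          % zero mean, unit variance for each output row
tval = mapstd('apply', Tval, ts);     % apply the same training statistics to val ...
ttst = mapstd('apply', Ttst, ts);     % ... and test targets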

If your current "performance value" is MSE , then normalize it with
MSE00 = mean(var(T,1)) to get

NMSE = MSE/MSE00
R2 = 1-MSE/MSE00
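Something like the following, assuming column-oriented targets t, inputs x, and a trained net (variable names are illustrative):

y     = net(x);                  % network outputs, one column per sample
MSE   = mse(net, t, y);          % or: mean(mean((t - y).^2))
MSE00 = mean(var(t',1));         % variance of each target row, averaged over outputs
NMSE  = MSE/MSE00;
R2    = 1 - NMSE;                % fraction of target variance explained by the net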

See the recent post on ANN_Error Goal

Hope this helps.

Greg