Topic: neural network does not generalize at all
Replies: 1   Last Post: May 25, 2014 6:48 PM

Greg Heath (Posts: 6,387; Registered: 12/7/04)
Re: neural network does not generalize at all
Posted: May 25, 2014 6:48 PM

"Noa" wrote in message <llnmek$h8v$1@newscl01ah.mathworks.com>...
> Hi all,
> I have a regression problem, in which the inputs are vectors with values from 0 to 1 (pixels from grayscale images), and the outputs are from 0 to 10 (ranking the images).
> I'm using fitnet.
> The training always stops because of "minimum gradient reached". Accordingly, the performance on the training set is going down, but unfortunately, the performance on the validation and test sets stays very high (almost a plateau).
> Reducing the number of hidden neurons doesn't help.
>
> Does that mean that my features are no good, so that the network cannot generalize using them? Or is there anything else?

To begin understanding, simplify as much as possible:

[ I N ] = size(input)     % = ?
[ O N ] = size(target)    % = ?
Ntrn = N                  % no val or tst subsets (divideFcn = 'dividetrain')
Ntrneq = N*O              % no. of training equations
% Nw = (I+1)*H + (H+1)*O  % no. of unknown weights for H hidden nodes
% Ntrneq > Nw  <==>  H <= Hub
Hub = -1 + ceil( (Ntrneq-O) / (I+O+1) )
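The weight-counting above is plain algebra, so it can be checked outside MATLAB. Here is a small Python sketch of the same arithmetic (Python only stands in for the MATLAB lines; the shapes I = 10, O = 1, N = 100 are made-up examples, not values from the original post):

```python
import math

def upper_bound_hidden(I, O, N):
    """Largest H for which the no. of training equations still
    exceeds the no. of unknown weights (Ntrneq > Nw)."""
    Ntrneq = N * O                     # no. of training equations
    # Nw = (I+1)*H + (H+1)*O = H*(I+O+1) + O  unknown weights
    return -1 + math.ceil((Ntrneq - O) / (I + O + 1))

def n_weights(I, O, H):
    """Unknown weights in an I-H-O fitnet (biases included)."""
    return (I + 1) * H + (H + 1) * O

# Example shapes (assumed for illustration): 10 inputs, 1 output, 100 cases
I, O, N = 10, 1, 100
Hub = upper_bound_hidden(I, O, N)
print(Hub)                               # -> 8
print(n_weights(I, O, Hub) < N * O)      # -> True  (Ntrneq > Nw holds at Hub)
print(n_weights(I, O, Hub + 1) < N * O)  # -> False (violated just above Hub)
```

With 100 training equations and 12 weights added per hidden node, H = 8 gives 97 weights (still under-determined in the right direction), while H = 9 gives 109 and over-fitting becomes possible.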

% Choose a 2-D grid h = Hmin:dH:Hmax <= Hub, ntrial = 1:10 (10 different
% random initial weight assignments for each h), and determine the
% smallest value of H that will yield a satisfactory design.

Once that works, you can go back and use the default 'dividerand'. Don't forget to design at least 10 nets for every value of h that you try.
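The "at least 10 nets per h" advice amounts to a double loop over candidate hidden-layer sizes and random weight initializations, taking the smallest h whose best-of-10 result is acceptable. A minimal Python skeleton of that search, where `train_net` is a hypothetical stand-in for a fitnet design run (faked here with a toy error surface) and `Hub` and `threshold` are assumed values:

```python
import random

def train_net(h, seed):
    """Hypothetical stand-in for designing one net with h hidden nodes
    and a seeded random weight initialization; returns a validation
    error. Faked here with a deterministic toy surface (best near h=5)."""
    random.seed(seed)
    return abs(h - 5) * 0.1 + 0.05 * random.random()

Hub = 12          # assumed upper bound from the weight-counting step
ntrials = 10      # at least 10 random initializations per candidate h
threshold = 0.08  # assumed "satisfactory" validation error

smallest_h = None
for h in range(1, Hub + 1):                  # grid over hidden-layer sizes
    best_err = min(train_net(h, seed=100 * h + t) for t in range(ntrials))
    if best_err <= threshold:                # first (smallest) h that qualifies
        smallest_h = h
        break

print(smallest_h)  # -> 5 on this toy surface
```

Seeding each trial keeps the search reproducible, which matters when comparing candidate values of h.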

I probably have some relevant examples posted on the NEWSGROUP or ANSWERS. Search using some combination of the keywords

greg fitnet dividetrain Ntrials

Date     Subject                                        Author
5/23/14  neural network does not generalize at all      Noa
5/25/14  Re: neural network does not generalize at all  Greg Heath