
Topic: Training multiple data for a single feedforwardnet
Replies: 19   Last Post: Dec 1, 2012 6:14 PM

Greg Heath
Re: Training multiple data for a single feedforwardnet
Posted: Oct 19, 2012 6:44 AM

"Carlos Aragon" wrote in message <k5n37h$ier$1@newscl01ah.mathworks.com>...
> I'm building a feedforwardnet like this:
>
> (..)
> P=[V';ia';w'];
> T=[tq'];
> net=feedforwardnet([5 25],'trainbr');

One hidden layer with H nodes is sufficient. Minimize H by trial and error: try about 10 small candidate values of H, with Ntrials = 10 random weight initializations for each value. For examples, search using some of the following keywords:

heath close clear Ntrials Neq Nw
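A minimal sketch of that search, under the assumption that P and T are the input and target matrices defined above (variable names Hvec, bestperf, and bestnet are illustrative, not from the original post):

```matlab
% Loop over candidate hidden-layer sizes and random weight
% initializations, keeping the net with the best validation error.
Hvec    = 1:10;        % candidate values of H (assumption)
Ntrials = 10;          % random initializations per H
bestperf = Inf;
for H = Hvec
    for trial = 1:Ntrials
        rng(trial)                      % reproducible initialization
        net = feedforwardnet(H);        % one hidden layer, H nodes
        [net, tr] = train(net, P, T);
        if tr.best_vperf < bestperf     % best validation performance
            bestperf = tr.best_vperf;
            bestnet  = net;
        end
    end
end
```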

> (..)
>
> How could I train this neural net for more than one group '[V';ia';w']'? What is the MATLAB structure to perform this kind of training?

For I-dimensional inputs and O-dimensional outputs,

[I N] = size(input)
[O N] = size(target)

yielding

Neq = N*O training equations for estimating

Nw = (I+1)*H+(H+1)*O weights.

Neq >= Nw when

H <= (Neq-O)/(I+O+1)
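For the poster's data those bounds work out as follows (a worked example, assuming the dimensions stated in the thread: N = 10006 samples, I = 3 inputs V, ia, w and O = 1 output tq):

```matlab
% Upper bound on H for Neq >= Nw, using the formulas above.
[I, N] = size(P);                       % I = 3,  N = 10006
[O, N] = size(T);                       % O = 1
Neq = N*O;                              % 10006 training equations
Hub = floor((Neq - O)/(I + O + 1));     % = floor(10005/5) = 2001
```

With Hub = 2001 the data could support a very large hidden layer, which is why the H-minimization search above starts from small values instead.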

> Note that 'P' in this case is a 10006x3 matrix that I extract from a motor model.

That is a large number of samples. You can probably use a simple 0.34/0.33/0.33 trn/val/tst random data split (net.divideFcn = 'dividerand') and 'trainlm'.
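That setup can be sketched as follows (note that the random split is done by 'dividerand'; 'dividetrain' would instead assign all samples to training, so the split ratios are an assumption based on the advice above):

```matlab
% Single hidden layer, Levenberg-Marquardt training,
% random 0.34/0.33/0.33 train/validation/test split.
net = feedforwardnet(H, 'trainlm');
net.divideFcn = 'dividerand';
net.divideParam.trainRatio = 0.34;
net.divideParam.valRatio   = 0.33;
net.divideParam.testRatio  = 0.33;
[net, tr] = train(net, P, T);
```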

Hope this helps.

Greg
