
Topic: Data division problem in neural network
Replies: 1   Last Post: Apr 27, 2013 6:17 PM

Greg Heath

Posts: 5,923
Registered: 12/7/04
Re: Data division problem in neural network
Posted: Apr 27, 2013 6:17 PM

"srishti" wrote in message <kl6731$8a8$1@newscl01ah.mathworks.com>...
> Hello,
> I am using neural network pattern recognition for classification purpose.My problem is I am not getting whether for input and target both I have to write individual command "trainInd,valInd,testInd] = divideind(Q,trainInd,valInd,testInd)"
> and if yes then how to define parameters(net.divideparm)? I have written the following code.I have read the documentation but I am not able to clear my doubt. Please help.
>
> s = RandStream('mcg16807','Seed', 0);
> RandStream.setDefaultStream(s)
> x=input; % size of x is 10x70
> t=target;% size of t is 3x70


[I N ] = size(x) % [ 10 70 ]
[O N ] = size(t) % [ 3 70 ]

% The default division ratios are 0.7/0.15/0.15
Ntst =round(0.15*N) % 11
Nval = Ntst % 11
Ntrn = N-2*Ntst % 48
trainind = 1:Ntrn           % 1:48
valind = Ntrn+1:Ntrn+Nval   % 49:59
tstind = Ntrn+Nval+1:N      % 60:70
Ntrneq = Ntrn*O % 144 No. of training equations

% No. of unknown weights
% Nw = (I+1)*H+(H+1)*O
% For more equations than unknowns H <= Hub

Hub= -1+ ceil( (Ntrneq -O) / (I+O+1) ) % 10
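
For example, at the bound itself the count just fits (a quick check of the formula above; the H = 11 comparison is mine):

% H = Hub = 10: Nw = 11*10 + 11*3 = 143 <= Ntrneq = 144
% H = 11:       Nw = 11*11 + 12*3 = 157 >  144, i.e., more unknown weights than training equations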

> net = patternnet(22);

No. With H = 22, Nw = (I+1)*H+(H+1)*O = 311 > 144 = Ntrneq, i.e., far more unknown weights than training equations. Choose H by trial and error for 0 <= H <= 10

H = 5 % Prefer a loop H = 0:1:10
net = patternnet(H); % remove semicolon to see default parameter values
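
For example, a rough sketch of such a loop (Ntrials, the stored tr.best_tperf values and the variable names are my own illustration, not part of the original code):

Hmax    = 10;                       % upper bound Hub from above
Ntrials = 5;                        % random weight initializations per H
BestTstPerf = zeros(Ntrials,Hmax);  % test performance at the validation stop
for h = 1:Hmax                      % H = 0 (linear, no hidden layer) needs separate handling
  for i = 1:Ntrials
    net = patternnet(h);
    net.divideFcn            = 'divideind';
    net.divideParam.trainInd = trainind;
    net.divideParam.valInd   = valind;
    net.divideParam.testInd  = tstind;
    [net tr y] = train(net,x,t);
    BestTstPerf(i,h) = tr.best_tperf;  % pick the h with the smallest value
  end
end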

> net.divideFcn='divideInd';
> [trainInd,valInd,testInd] = divideind(x,1:20,35:45,54:65);


Replace x with N (the first argument of divideind is the number of samples, not the data matrix).
Not sure why you are not using all of the data.
[trainInd,valInd,testInd] = divideind(N,trainind,valind,tstind);

> net.divideParam.trainInd = trainInd
> net.divideParam.valInd = valInd
> net.divideParam.testInd = testInd
> net= train(net,x,t);


[ net tr y ] = train(net,x,t); % y is the network output

net = net % See all of the input parameters

tr = tr % See all of the training results.
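
As a usage illustration (the error-rate variable names are mine; the subset index vectors are the ones stored in the training record tr):

trueclass     = vec2ind(t);             % 1xN true class indices
assignedclass = vec2ind(y);             % 1xN assigned class indices
err = assignedclass ~= trueclass;
PctErrtrn = 100*mean(err(tr.trainInd))  % training set error rate in percent
PctErrval = 100*mean(err(tr.valInd))    % validation set error rate in percent
PctErrtst = 100*mean(err(tr.testInd))   % test set error rate in percent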

Hope this helps.

Greg


