
Re: Neural Networks weights and bias help
Posted: Dec 20, 2012 4:54 AM


> What EXACTLY do you want to do?
I have one year of stock data. My 12 inputs are fuzzified data, and my 12 outputs are also fuzzified data. I want to compare the original data with the neural network's result, and I also want to predict the stock data.

> What version of MATLAB do you have?

MATLAB Version 7.10.0.499 (R2010a)

> Please post the documentation you get from the command
>
> help newff

NEWFF Create a feedforward backpropagation network.

 Syntax

   net = newff(P,T,S,TF,BTF,BLF,PF,IPF,OPF,DDF)

 Description

   NEWFF(P,T,S,TF,BTF,BLF,PF,IPF,OPF,DDF) takes,
     P   - RxQ1 matrix of Q1 representative R-element input vectors.
     T   - SNxQ2 matrix of Q2 representative SN-element target vectors.
     Si  - Sizes of N-1 hidden layers, S1 to S(N-1), default = [].
           (Output layer size SN is determined from T.)
     TFi - Transfer function of ith layer. Default is 'tansig' for
           hidden layers, and 'purelin' for output layer.
     BTF - Backprop network training function, default = 'trainlm'.
     BLF - Backprop weight/bias learning function, default = 'learngdm'.
     PF  - Performance function, default = 'mse'.
     IPF - Row cell array of input processing functions.
           Default is {'fixunknowns','removeconstantrows','mapminmax'}.
     OPF - Row cell array of output processing functions.
           Default is {'removeconstantrows','mapminmax'}.
     DDF - Data division function, default = 'dividerand';
   and returns an N layer feedforward backprop network.

   The transfer functions TF{i} can be any differentiable transfer
   function such as TANSIG, LOGSIG, or PURELIN.

   The training function BTF can be any of the backprop training
   functions such as TRAINLM, TRAINBFG, TRAINRP, TRAINGD, etc.

   *WARNING*: TRAINLM is the default training function because it is
   very fast, but it requires a lot of memory to run. If you get an
   "out-of-memory" error when training, try doing one of these:

   (1) Slow TRAINLM training, but reduce memory requirements, by
       setting NET.trainParam.mem_reduc to 2 or more. (See HELP TRAINLM.)
   (2) Use TRAINBFG, which is slower but more memory efficient than
       TRAINLM.
   (3) Use TRAINRP, which is slower but more memory efficient than
       TRAINBFG.

   The learning function BLF can be either of the backpropagation
   learning functions such as LEARNGD or LEARNGDM.

   The performance function can be any of the differentiable performance
   functions such as MSE or MSEREG.

 Examples

   load simplefit_dataset
   net = newff(simplefitInputs,simplefitTargets,20);
   net = train(net,simplefitInputs,simplefitTargets);
   simplefitOutputs = sim(net,simplefitInputs);

 Algorithm

   Feedforward networks consist of Nl layers using the DOTPROD weight
   function, NETSUM net input function, and the specified transfer
   functions.

   The first layer has weights coming from the input. Each subsequent
   layer has a weight coming from the previous layer. All layers have
   biases. The last layer is the network output.

   Each layer's weights and biases are initialized with INITNW.

   Adaption is done with TRAINS, which updates weights with the
   specified learning function. Training is done with the specified
   training function. Performance is measured according to the
   specified performance function.

 See also

   newcf, newelm, sim, init, adapt, train, trains

 Reference page in Help browser
   doc newff

> Greg
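For the 12-input / 12-output setup described above, the `help newff` example translates into something like the sketch below. The variable names `p` and `t` are placeholders (not from the original post): `p` would be a 12xN matrix of fuzzified inputs and `t` a 12xN matrix of fuzzified targets, with one column per sample.

```matlab
% Sketch only, assuming p (12xN fuzzified inputs) and t (12xN fuzzified
% targets) are already loaded; names and hidden-layer size are assumptions.
net = newff(p, t, 10);        % one hidden layer with 10 neurons
net = train(net, p, t);       % default TRAINLM training
y   = sim(net, p);            % network outputs, same size as t
err = mse(t - y);             % compare network result with original data
```

With R2010a the data division defaults to 'dividerand', so `train` will automatically split the columns of `p`/`t` into training, validation, and test sets; for prediction on new data, pass the new fuzzified input columns to `sim(net, pnew)`.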

