
Re: Neural Networks weights and bias help
Posted:
Jan 22, 2016 12:17 AM


"Vikas" wrote in message <n7qkqf$6o$1@newscl01ah.mathworks.com>... > "Greg Heath" <heath@alumni.brown.edu> wrote in message <n7ookr$kc3$1@newscl01ah.mathworks.com>... > > "Vikas" wrote in message <n7oepr$sju$1@newscl01ah.mathworks.com>... > > > "Greg Heath" <heath@alumni.brown.edu> wrote in message <kackrs$k0t$1@newscl01ah.mathworks.com>... > > > > "Renan" wrote in message <kaa61v$hau$1@newscl01ah.mathworks.com>... > > > > > Please, I´m in the same situation. Did you figure it out? > > > > > > > > What EXACTLY do you want to do? > > > > > > > > What version of MATLAB do you have? > > > > > > > > Please post the documentation you get from the command > > > > > > > > help newff > > > > > > > > Greg > > > > > > What's the difference between BTF (Backprop network training function, default = 'trainlm') and BLF (Backprop weight/bias learning function, default = 'learngdm') in the newff help document? Its some what confusing in the document, can you share your knowledge on this please? > > > > > > NEWFF(P,T,S,TF,BTF,BLF,PF,IPF,OPF,DDF) takes, > > > P  RxQ1 matrix of Q1 representative Relement input vectors. > > > T  SNxQ2 matrix of Q2 representative SNelement target vectors. > > > Si  Sizes of N1 hidden layers, S1 to S(N1), default = []. > > > (Output layer size SN is determined from T.) > > > TFi  Transfer function of ith layer. Default is 'tansig' for > > > hidden layers, and 'purelin' for output layer. > > > BTF  Backprop network training function, default = 'trainlm'. > > > BLF  Backprop weight/bias learning function, default = 'learngdm'. > > > PF  Performance function, default = 'mse'. > > > IPF  Row cell array of input processing functions. > > > Default is {'fixunknowns','removeconstantrows','mapminmax'}. > > > OPF  Row cell array of output processing functions. > > > Default is {'removeconstantrows','mapminmax'}. 
> > > DDF  Data division function, default = 'dividerand'; > > > > trainlm is the levenbergmarqardt training function > > > > help trainlm > > doc trainlm > > > > learngdm is a backpropagation learning algorithm that uses gradient descent with momentum > > > > help learngdm > > doc traingdm > > > > Happy reading! > > > > Greg > > Thanks for reply Greg........ I know these algorithms but my question was why both "trainlm" and "learngdm" need to use in NEWFF(P,T,S,TF,BTF,BLF,PF,IPF,OPF,DDF) as BTF and BLF? As far as ANN is concerned the weights and bais is to optimize through any one of the training algorithm like gradient descent or levenbergmarqwartd and many others algorithm, so what the need to use two algorithms trainlm and learngdm (aka, traingdm) at a time. What BLF is doing here? please share your large knowledge and wisdom over Matlab functions.
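For what it's worth, here is a minimal sketch of where the two arguments sit in a newff call. This assumes an older Neural Network Toolbox release (newff was deprecated later), and my understanding is that BTF and BLF serve two different training modes: BTF is the batch training function invoked by train, while BLF is the incremental per-weight learning rule consulted by adapt (or by learning-based trainers such as trainb/trains); a Jacobian-based trainer like trainlm does not use the BLF at all.

    % Sketch, not a definitive recipe -- data and layer size are made up.
    P = rand(3, 50);     % 3-element inputs, 50 samples
    T = rand(1, 50);     % 1-element targets

    % BTF = 'trainlm', BLF = 'learngdm' (both are the documented defaults)
    net = newff(P, T, 10, {'tansig','purelin'}, 'trainlm', 'learngdm');

    net = train(net, P, T);   % batch training: uses BTF ('trainlm');
                              % BLF plays no role in this call
    Y = sim(net, P);

    % net = adapt(net, P, T); % incremental mode: this is where the
                              % BLF ('learngdm') would be applied

So specifying both does not mean two optimizers run at once; only one is active, depending on whether you call train or adapt.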
I am an engineer with a limited knowledge of source code. Some time ago, source code obtained using the command TYPE was relatively easy to understand. I guess MathWorks felt that was detrimental to the survival (or increase) of the profit level that they desired (or aspired to).
Now, not so easy. Anyway, in addition to the HELP and DOC commands, try TYPE.
The best bet is to find the oldest versions of the code possible. I found the 2004 source code very helpful. However, I lost that and later versions in a computer crash. Now, my eyes glaze over when I see the source code.
Good Luck,
Greg

