Views expressed in these public forums are not endorsed by
NCTM or The Math Forum.



Error in Selection of Optimum Parameters in Neural Network
Posted: Apr 17, 2013 7:00 AM


Hello All,

I have a problem selecting the optimum configuration from my neural network models. I compute the R2 statistic for training, the R2 statistic for training with a degree-of-freedom adjustment, and then choose the configuration with the highest sum (R2 for training with degree-of-freedom adjustment + R2 for validation).
Which is the best configuration here?

Case  Neurons  R2Train  R2TrainDOF  R2Val   R2Tst  MSETrain  MSEVal  R2TrainDOF+R2Val
 1       9     0.8901    0.8832     0.8799  0.751   0.109    0.119       1.7632
 2      16     0.8906    0.8777     0.8871  0.785   0.109    0.111       1.7646
 3      19     0.9005    0.8864     0.8641  0.768   0.099    0.1347      1.7505
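For what it is worth, the selection criterion can be recomputed mechanically. Below is a minimal Python sketch (the dictionary layout and variable names are my own; the numbers are copied from the table above) that sums R2TrainDOF + R2Val for each case, picks the largest, and also flags any case where R2Val exceeds R2TrainDOF:

```python
# Candidate networks from the table above (values as posted, already rounded).
cases = {
    1: {"neurons": 9,  "r2_train_dof": 0.8832, "r2_val": 0.8799},
    2: {"neurons": 16, "r2_train_dof": 0.8777, "r2_val": 0.8871},
    3: {"neurons": 19, "r2_train_dof": 0.8864, "r2_val": 0.8641},
}

# Selection criterion: R2TrainDOF + R2Val, larger is better.
scores = {k: v["r2_train_dof"] + v["r2_val"] for k, v in cases.items()}
best = max(scores, key=scores.get)

# Consistency check: validation R2 above the DOF-adjusted training R2 is suspicious.
suspect = [k for k, v in cases.items() if v["r2_val"] > v["r2_train_dof"]]

print(best, round(scores[best], 4), suspect)
```

Note that the sums recomputed from the rounded table entries give 1.7648 for case 2 rather than the 1.7646 listed, presumably because the posted sums were formed from unrounded values; the ranking is unchanged either way.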
Of the three cases, which one is the best? In my understanding, case 2 shows the better result, since its R2 statistic for training under the degree-of-freedom adjustment plus its R2 for validation is greater than in the other cases. But there is a contradiction: its R2Val (0.8871) is greater than its degree-of-freedom-adjusted R2 for training (0.8777), which should not happen. So my optimum choice is case 1, which does not violate that rule.
Please, could anyone advise me? Further, Prof. Greg Heath has mentioned a degree-of-freedom adjustment for Hmax and the training equations:

    Hub = 1 + ceil((Ntrneq-O)/(I+O+1));
    Hmax = round(Hub/8);

Is this done to avoid overfitting, or for some other reason? Also, could you send me references, so that I can cite this equation in the conference paper I am submitting?
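The bound Hub appears to follow from requiring that the number of unknown weights not exceed the number of training equations Ntrneq = Ntrn*O. A sketch of that derivation in Python (the function names and example sizes are mine, and the exact form quoted above may differ by a rounding convention):

```python
import math

def n_weights(I, H, O):
    # Weights in an I-H-O feedforward net, counting bias weights:
    # (I+1) weights into each of H hidden nodes, (H+1) into each of O outputs.
    return (I + 1) * H + (H + 1) * O

def hub(Ntrn, I, O):
    # Ntrneq = number of training equations = Ntrn * O.
    # Requiring n_weights(I, H, O) <= Ntrneq gives
    #   H * (I + O + 1) <= Ntrneq - O,
    # so the largest admissible integer H is:
    Ntrneq = Ntrn * O
    return math.floor((Ntrneq - O) / (I + O + 1))

def r2_dof_adjusted(sse, sst, Ntrneq, Nw):
    # Degree-of-freedom-adjusted R^2: divide the error and total sums of
    # squares by their remaining degrees of freedom, so that estimating
    # Nw weights is penalized (this is the usual adjusted-R^2 idea).
    return 1.0 - (sse / (Ntrneq - Nw)) / (sst / (Ntrneq - 1))

# Hypothetical example: 100 training cases, 5 inputs, 1 output.
H = hub(100, 5, 1)
print(H, n_weights(5, H, 1))
```

With H at or below Hub the net has no more weights than training equations, which is exactly the overfitting guard: above Hub the training set can be fit (nearly) exactly regardless of generalization, and the DOF-adjusted R2 becomes meaningless because Ntrneq - Nw reaches zero.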
Thank you in advance.
Subodh



