
Topic: Error in Selection of Optimum Parameters in Neural Network
Replies: 3   Last Post: Apr 18, 2013 10:08 PM

 Subodh Paudel Posts: 2 Registered: 4/17/13
Error in Selection of Optimum Parameters in Neural Network
Posted: Apr 17, 2013 7:00 AM

Hello All,
I have an error in selection of optimum configuration from the neural network model. I consist R2 statisticics for training, R2 statistics for training under degree of freedom adjustment and i choose the highest (R2 statistics for training under degree of freedom adjustment + R2 for Validation:).

Which is the best condition here:
Case  Neurons  R2Train  R2Train-DOF  R2Val   R2Tst  MSETrain  MSEVal  R2Train-DOF+R2Val
1     9        0.8901   0.8832       0.8799  0.751  0.109     0.119   1.7632
2     16       0.8906   0.8777       0.8871  0.785  0.109     0.111   1.7646
3     19       0.9005   0.8864       0.8641  0.768  0.099     0.1347  1.7505

Of the three cases, which one is the best?
1) In my understanding, case 2 gives the better result, since its R2 statistic for training under the degree-of-freedom adjustment plus its R2 for validation is greater than in the other cases. But there is a contradiction: R2Val (0.8871) is greater than the degree-of-freedom-adjusted training R2 (0.8777), which should not happen. So my optimum choice is case 1, which does not violate any rule.
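For reference, the degree-of-freedom adjustment I have in mind follows the standard adjusted-R2 form, sketched here in Python (my own illustration; n is the number of training samples and p the number of estimated weights — if the adjustment used in practice differs, please correct me):

```python
def adjusted_r2(r2, n, p):
    """Standard degree-of-freedom adjusted R^2: shrinks R^2 to
    penalize the p estimated parameters fitted on n samples."""
    return 1.0 - (1.0 - r2) * (n - 1) / (n - p - 1)
```

Because the adjustment only shrinks the training R2, a validation R2 above the adjusted training R2 (as in case 2) suggests either sampling noise in the validation split or a mismatch in how the statistics are computed.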

Please, could anyone advise me? Further, Prof. Greg Heath mentioned a degree-of-freedom adjustment for Hmax and the training equations:

Hub = -1+ceil((Ntrneq-O)/(I+O+1));
Hmax = round(Hub/8);

Is this done to avoid overfitting, or for some other reason?
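To check that I read the formula correctly, here is how I would compute it in Python (my own sketch; I am assuming Ntrneq = Ntrn*O is the number of training equations for Ntrn training cases and O outputs, and that the bound is meant to keep the number of unknown weights Nw = (I+1)*H + (H+1)*O at or below Ntrneq):

```python
import math

def hidden_node_limits(Ntrn, I, O):
    """Upper bound Hub on hidden nodes H, as I understand the formula.
    With Ntrneq = Ntrn*O training equations, choosing H <= Hub keeps
    the number of unknown weights Nw = (I+1)*H + (H+1)*O within Ntrneq,
    so the net is not underdetermined (i.e. it guards against overfitting)."""
    Ntrneq = Ntrn * O
    Hub = -1 + math.ceil((Ntrneq - O) / (I + O + 1))
    Hmax = round(Hub / 8)
    return Hub, Hmax

# e.g. Ntrn = 100 cases, I = 3 inputs, O = 1 output
print(hidden_node_limits(100, 3, 1))
```

With these example numbers, Hub = 19, so H = 19 gives Nw = 96 unknowns against Ntrneq = 100 equations, while H = 20 would use up all the degrees of freedom.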
Also, could you send me the references, so that I can cite this equation in the conference paper I am submitting?