Date: Oct 18, 2013 3:35 PM
Subject: Problem with 1-step ahead prediction in neural network
I am having trouble with 1-step-ahead prediction in a neural network.
Every time I train the network with the same fixed parameters, I get different weights (IW, LW, b).
I know the reason is the random initial weights. But if the result changes on every training run, why can we trust the 1-step-ahead prediction? Maybe the network has not converged, because if it had converged there should be only one solution (or approximate solution). So does the network converge?
Because of this, the test predictions for 100 new points differ from run to run, and sometimes the differences are very large.
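For example, here is a toy sketch of what I mean (written in plain NumPy, not the toolbox, just to illustrate): the same training data and the same fixed hyperparameters still give different final weights on each run, unless the random seed is fixed before initialization. The logistic-map series and the tiny one-hidden-layer net are made up for the illustration.

```python
import numpy as np

def train_net(seed, X, y, hidden=5, epochs=500, lr=0.1):
    # Random initial weights -- this is why every training run
    # can end at a (slightly) different solution.
    rng = np.random.default_rng(seed)
    W1 = rng.normal(0.0, 0.5, (X.shape[1], hidden))
    b1 = np.zeros(hidden)
    W2 = rng.normal(0.0, 0.5, (hidden, 1))
    b2 = np.zeros(1)
    for _ in range(epochs):
        h = np.tanh(X @ W1 + b1)          # hidden layer
        err = (h @ W2 + b2) - y           # prediction error
        dh = (err @ W2.T) * (1.0 - h**2)  # backprop through tanh
        W2 -= lr * h.T @ err / len(X)
        b2 -= lr * err.mean(axis=0)
        W1 -= lr * X.T @ dh / len(X)
        b1 -= lr * dh.mean(axis=0)
    return W1, W2

# 1-step-ahead data: predict x[t+1] from x[t] (logistic map as a toy series)
x = np.empty(100)
x[0] = 0.3
for t in range(99):
    x[t + 1] = 3.7 * x[t] * (1.0 - x[t])
X, y = x[:-1, None], x[1:, None]

W1_a, _ = train_net(0, X, y)
W1_b, _ = train_net(0, X, y)  # same seed -> identical weights
W1_c, _ = train_net(1, X, y)  # different seed -> different weights
print(np.allclose(W1_a, W1_b), np.allclose(W1_a, W1_c))
```

If I understand correctly, the same idea should apply in MATLAB by fixing the random seed (e.g. calling `rng(0)`) before initializing the network, so that retraining becomes reproducible. Is that the right way to think about it?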
Please help me fix these problems.
Thank you very much.