Topic: Improving ANN results
Replies: 16   Last Post: Nov 13, 2013 9:10 AM

Greg Heath

Posts: 5,955
Registered: 12/7/04
Re: Improving ANN results
Posted: Oct 15, 2013 7:07 PM

"chaudhry " <bilal_zafar9@yahoo.com> wrote in message <l3ebuu$evs$1@newscl01ah.mathworks.com>...

> How can I improve ANN results by reducing error through the hidden layer size, through the MSE, or by using a while loop?

Your data is not a good learning example: it is small, x(1,:) is constant, and the relationship between the input and the target is weak.

1. Practice on MATLAB data (e.g., simplefit_dataset)
help nndata
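For example, a minimal starting point (a sketch; simplefit_dataset ships with the Neural Network Toolbox):
[ x t ] = simplefit_dataset; % 1-dimensional curve-fitting input and target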
2. Size the data
[ I N ] = size(x)
[ O N ] = size(t)
% Default Data Division
Ntst = round(0.15*N)
Nval = Ntst
Ntrn = N-2*Ntst;
Ntrneq = Ntrn*O % No. of training equations
3. Standardize (zscore or mapstd) and plot the data.
zx = zscore(x',1)'; % transpose back so samples stay in columns, as train expects
zt = zscore(t',1)';
4. Remove or modify outliers. If necessary, repeat step 3.
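A minimal sketch of such a check (my own illustration; the |z| > 3 cutoff is an arbitrary choice):
absz = max(abs(zx),[],1); % worst standardized input value in each sample (column)
iout = find(absz > 3) % candidate outlier columns to inspect
% After inspection, either correct the flagged samples or delete the columns:
% zx(:,iout) = []; zt(:,iout) = [];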
5. Start with the examples in the help documentation and accept all defaults.
6. Ignore the GUI-generated code; it is confusing because it lists all of the options.
7. Use a for loop to design Ntrials (>= 10) different nets, to mitigate the effects of the default random data divisions and random initial weights. To obtain accurate generalization-estimate statistics, Ntrials*Ntst should be sufficiently large.
a. Ntrials >= max( 10, 30/Ntst )
b. Initialize the random number generator before the loop
c. Use configure at the top of the loop to randomize initial weights
d. Obtain the training record tr via
[ net tr ] = train( net, zx, zt );
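A minimal sketch of the loop in 7a-7d (my own illustration, assuming fitnet with the default 10 hidden nodes; the names Ntrials, MSEval and MSEtst are just illustrative):
Ntrials = max(10, ceil(30/Ntst)) % 7a
rng(0) % 7b: repeatable random initializations and data divisions
net = fitnet(10); % default-sized regression net (step 5)
MSEval = zeros(Ntrials,1);
MSEtst = zeros(Ntrials,1);
for i = 1:Ntrials
    net = configure(net, zx, zt); % 7c: new random initial weights
    [ net tr ] = train(net, zx, zt); % 7d: tr is the training record
    MSEval(i) = tr.best_vperf; % validation MSE (used in step 8)
    MSEtst(i) = tr.best_tperf; % test MSE (used in step 9)
end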
8. Rank the nets w.r.t. the validation set R-squared (see http://en.wikipedia.org/wiki/R-squared) and ignore very poor designs. If not enough designs survive, design more.
MSEval = tr.best_vperf;
R2val = 1-MSEval; % tval assumed standardized
9. To obtain an UNBIASED estimate of performance on unseen data, obtain the mean and standard deviation of the MSEtst = tr.best_tperf (or R2tst) values for the surviving nets.
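Continuing the sketch above for steps 8 and 9 (the 0.8 acceptance threshold is an arbitrary choice of mine):
R2val = 1 - MSEval; % step 8: valid because zt is standardized
R2tst = 1 - MSEtst;
igood = find(R2val >= 0.8) % surviving designs; design more nets if too few survive
R2tstmean = mean(R2tst(igood)) % step 9: unbiased estimate of performance on unseen data
R2tststd = std(R2tst(igood))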
10. If results are unsatisfactory, consider increasing the number of hidden nodes.
11. Otherwise, try to reduce the number of hidden nodes to increase robustness w.r.t. noise, measurement error and outliers (although an outlier check should always be performed before using any net).
12. Search my NEWSGROUP and ANSWERS posts for multi-loop examples.
Good search words are:
neural greg fitnet Ntrials (for regression and curve-fitting)
neural greg patternnet Ntrials (for classification and pattern-recognition)
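An outline of such a multi-loop design (my own sketch, not taken from those posts; the candidate sizes in Hvals are arbitrary):
Hvals = 1:2:21; % candidate numbers of hidden nodes
R2val = zeros(numel(Hvals), Ntrials);
for j = 1:numel(Hvals)
    net = fitnet(Hvals(j));
    for i = 1:Ntrials
        net = configure(net, zx, zt);
        [ net tr ] = train(net, zx, zt);
        R2val(j,i) = 1 - tr.best_vperf; % pick the smallest Hvals(j) with acceptable R2val
    end
end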
13. If you are designing time-series nets (e.g., timedelaynet, narnet or narxnet):
a. Consider the delays of the significant correlations in the target/target autocorrelation function and/or the target/input crosscorrelation function.
b. Keep timesteps uniform by using divideblock and/or divideind.
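A minimal narxnet sketch for 13 (my own illustration; the 1:2 delay choices are placeholders that should come from the significant correlation lags in 13a, and X and T are assumed to be cell arrays of input and target time steps):
net = narxnet(1:2, 1:2, 10); % input delays, feedback delays, hidden nodes
net.divideFcn = 'divideblock'; % 13b: keep the data division in contiguous blocks of timesteps
[ Xs Xi Ai Ts ] = preparets(net, X, {}, T); % shift the series for the chosen delays
[ net tr ] = train(net, Xs, Ts, Xi, Ai);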

Hope this helps.

Greg




