
Topic: neural network
Replies: 3   Last Post: Mar 14, 2013 12:46 PM

Greg Heath

Re: neural network
Posted: Mar 14, 2013 12:46 PM

"srishti" wrote in message <khpvr7$qiu$1@newscl01ah.mathworks.com>...
> Thanks for your help. But can you please tell me how to do that?

Initialize the rng only once in the program before

1. Data division
2. Weight initialization

You don't have to know exactly where these occur. Just initialize the
RNG at the beginning of the program.

See

help rng
doc rng
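
For example, a minimal sketch (the seed 0 is arbitrary; any fixed nonnegative
integer will do, and 'default' resets the generator to its startup state):

rng(0)   % or rng('default'); call this ONCE at the top of the script
% ... data loading, net creation, and calls to TRAIN follow ...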

Hope this helps.

Greg

P.S. Data division and weight initialization are performed at different points
in different NNTBX versions.

1. In very old versions, random weights are automatically assigned at net creation.
However, you have to do the data division yourself:

net = newff( minmax(x), [H O] );
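
For example, a hedged sketch of a manual division (the 0.70/0.15/0.15 split and the
index names are just illustrative; x and t are input and target matrices with one
case per column, H the number of hidden nodes, O the output dimension):

[ I, N ] = size(x);                  % N = number of cases
ind      = randperm(N);              % random shuffle of the case indices
Ntrn     = round(0.70*N);
Nval     = round(0.15*N);
trnInd   = ind( 1:Ntrn );
valInd   = ind( Ntrn+1:Ntrn+Nval );
tstInd   = ind( Ntrn+Nval+1:end );

net      = newff( minmax(x), [H O] );
net      = train( net, x(:,trnInd), t(:,trnInd) );  % train on the training subset only
% the val/test columns are held out for stopping and unbiased evaluation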

2. In the latest obsolete version (obsoleted in R2010b, NNET 7.0), NEWFF and its
special forms (NEWFIT for regression & curve fitting; NEWPR for classification
& pattern recognition) perform BOTH weight initialization and data division
automatically at net creation:

net = newpr( x, t, H );

help newpr
doc newpr
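
For example (a sketch; x is the input matrix, t the 0/1 class-indicator target
matrix, and H the number of hidden nodes):

net       = newpr( x, t, H );     % weights initialized and data divided here
[net, tr] = train( net, x, t );   % tr is the training record
y         = sim( net, x );        % network outputs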

3. In the current version, the special forms of FEEDFORWARDNET (FITNET & PATTERNNET)
can be used. However, NEITHER weight initialization NOR data division is performed
automatically at net creation. By default, BOTH occur automatically at the FIRST call
of TRAIN. However,

a. If you are looping over multiple candidate designs, weights will not be reinitialized
at subsequent calls of TRAIN. Therefore, it is best to explicitly initialize the weights
with CONFIGURE before calling TRAIN (see the sketch after item b below), unless, for some
reason, you want to continue training the same design.

b. I don't think that subsequent calls of TRAIN in a loop will automatically redivide
the data. Although this is not critical when searching for well-performing weight
configurations, I will check and reply if I am wrong.
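
For example, a hedged sketch of such a loop (PATTERNNET shown; FITNET is analogous;
Ntrials, H, and the use of the training-record field tr.best_vperf are illustrative
assumptions, not a prescription):

rng(0)                                    % seed ONCE, before the search loop
net = patternnet( H );                    % or fitnet( H ) for regression
for i = 1:Ntrials
    net       = configure( net, x, t );  % explicit weight (re)initialization
    [net, tr] = train( net, x, t );      % data division occurs at this call
    perf(i)   = tr.best_vperf;           % validation performance of design i
end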


