Topic: Help with Elman neural network
Replies: 4   Last Post: Nov 24, 2012 5:24 PM

 Greg Heath Posts: 6,387 Registered: 12/7/04
Re: Help with Elman neural network
Posted: Nov 24, 2012 5:24 PM

"Zeeshan" wrote in message <k8ptu3\$59n\$1@newscl01ah.mathworks.com>...
> "Greg Heath" <heath@alumni.brown.edu> wrote in message <k8ps7o\$64\$1@newscl01ah.mathworks.com>...
> > "Zeeshan" wrote in message <k8ppuh\$mdp\$1@newscl01ah.mathworks.com>...
> > > Hi,
> > > I have a time series data which has a healthy set and a test set.

What do you mean by healthy???

> > > I want to use an Elman NN to make one step ahead predictions when applied to the test set. But what should be the input series and target series for applying the Elman NN?
> > >
> > > Also I'm confused: for a single variable time series to be modeled using a NN, what should be the input vector and target vector?

> >
> > input=y(1:N-1);
> > output = y(2:N);

If you train any of the STATIC nets newff, newfit, feedforwardnet or fitnet on this data, you will get a one step ahead OPEN LOOP predictor.
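To make this concrete, here is a minimal open loop sketch along those lines using fitnet (one of the static nets named above). The variable y is assumed to be a 1-by-N row vector holding the series; the hidden layer size of 10 is just a placeholder:

```matlab
% One step ahead OPEN LOOP prediction with a static net.
% Assumes y is a 1-by-N row vector containing the time series.
X = y(1:N-1);            % inputs:  y(t)
T = y(2:N);              % targets: y(t+1)
net = fitnet(10);        % static net, 10 hidden neurons (arbitrary choice)
net = train(net, X, T);  % learn the y(t) -> y(t+1) mapping
yhat = net(X);           % one step ahead predictions on the training inputs
```

Note that yhat uses the TRUE past values as inputs at every step; it never feeds its own predictions back in. That is what makes it open loop.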

However, if you want the net to recursively use its predictions to predict ahead further, you have to use a DYNAMIC net to obtain a CLOSED LOOP predictor.

It is not clear what you want.

You said you wanted to use an Elman net. However,

1. From the online documentation

help elmannet
-----------------------
Elman neural network.

Elman networks are provided for historical interest. For much better
results use narxnet, timedelaynet, or distdelaynet.

2. From the Reference page in Help browser

doc elmannet
-------------------------
Elman neural network

Elman networks are feedforward networks (feedforwardnet) with the
addition of layer recurrent connections with tap delays.

With the availability of full dynamic derivative calculations (fpderiv and bttderiv),
the Elman network is no longer recommended except for historical and research
purposes. For more accurate learning try time delay (timedelaynet), layer
recurrent (layrecnet), NARX (narxnet), and NAR (narnet) neural networks.

> MY APPLICATION:
> My data is in the form of cycles. One cycle has 2000 observations. I have ten cycles in the healthy dataset and ten in the test dataset.
>
> I want to use a recurrent neural network by training it with my 10 cycles of healthy dataset
> and once that is trained, I want it to make a one step ahead prediction on each of the cycles of the test dataset individually.
>
> For applying NARNET or Elman, we need two series: target and input.

No. NARNET only uses ONE SERIES.

3. From using the commands "help" and "doc" it is readily determined that
a. TIMEDELAYNET has delayed inputs but no output feedback
b. NARNET has output feedback but no input
c. NARXNET has both delayed inputs and output feedback

Your problem can be solved with NARNET. Just follow the examples in

help narnet

doc narnet

help closeloop

doc closeloop
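The documented NARNET workflow looks roughly like the following sketch. It assumes y is a row vector holding the training series; the feedback delays 1:2 and the 10 hidden neurons are placeholder choices, not recommendations:

```matlab
% CLOSED LOOP prediction with NARNET -- only ONE series is needed.
% The toolbox time-series functions expect data as a cell array.
T = num2cell(y);                            % y: training series (row vector)
net = narnet(1:2, 10);                      % feedback delays 1:2, 10 hidden neurons
[Xs, Xi, Ai, Ts] = preparets(net, {}, {}, T);
net = train(net, Xs, Ts, Xi, Ai);           % train in OPEN LOOP form
netc = closeloop(net);                      % convert to a CLOSED LOOP predictor
[Xc, Xic, Aic] = preparets(netc, {}, {}, T);
yc = netc(Xc, Xic, Aic);                    % recursive (closed loop) predictions
```

Notice there is no separate input series anywhere: preparets constructs the delayed inputs and the targets from the single series T.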

> But for time series we only have one variable, so is it correct to make the targets = (2000x10)=20000 observations and the inputs=2000 observations from the current test cycle?
>

No. Just follow the directions in the documentation.

> Also if these two (input and target) vectors are not equal, matlab gives an error. How do people train with a big dataset and then apply that on a smaller test dataset?

Input and target sets must have the same number of cases when training.

However, for closed loop applications there is only one series. Therefore, each output has to be compared with the next value of that series.

> So for example in the following code I don't know what to give as X and T and why?
> X = ?
> T= ?

X = y(1:N-1);
T = y(2:N);

Look familiar?

> net = elmannet(1:2,10);

Compare this with NARNET.

> [Xs,Xi,Ai,Ts] = preparets(net,X,T);
> net = train(net,Xs,Ts,Xi,Ai);

I suggest that you get used to the process by using several much smaller data sets.

Hope this helps.

Greg
