How can I manually perform an elmannet neural network calculation?

1 view (last 30 days)

Accepted Answer

Greg Heath on 30 Nov 2016
My guess is
z(t) = B1 + IW * [ x(t); z(t-1); z(t-2)];
y(t) = B2 + LW * z(t);
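Fleshed out as a runnable sketch for the default 2-layer elmannet (assuming tansig hidden and purelin output transfer functions, mapminmax input/output processing, and layer delays 1:2; variable names like xn and z1 are just illustrative), this might look roughly like:

% Manual forward pass for a trained default elmannet, net, on input series x.
% Assumes removeconstantrows drops nothing and mapminmax is the last processing step.
IW  = net.IW{1,1};                    % input  -> hidden weights
LWr = net.LW{1,1};                    % hidden -> hidden (context) weights, delays 1:2 stacked
LWo = net.LW{2,1};                    % hidden -> output weights
b1  = net.b{1};
b2  = net.b{2};
xn  = mapminmax('apply', x, net.inputs{1}.processSettings{end});   % normalize the input
H   = size(IW,1);                     % number of hidden neurons
z1  = zeros(H,1);  z2 = zeros(H,1);   % z(t-1), z(t-2): zero initial layer states
Y   = zeros(size(LWo,1), size(xn,2));
for t = 1:size(xn,2)
    z  = tansig(IW*xn(:,t) + LWr*[z1; z2] + b1);   % hidden layer with context feedback
    Y(:,t) = purelin(LWo*z + b2);                  % linear output layer
    z2 = z1;  z1 = z;                              % shift the delay line
end
y = mapminmax('reverse', Y, net.outputs{2}.processSettings{end});  % undo output scaling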
Hope this helps
Thank you for formally accepting my answer
Greg
  4 Comments
Qinwan Rabbani on 5 Dec 2016 (edited 5 Dec 2016)
This is the diagram for a 4-layer network with two hidden layers and a time delay of only 1: [network diagram image]
This is net.IW: [screenshot of the IW cell array]
This is net.LW: [screenshot of the LW cell array]
From what I can see, hidden layers 1 and 2 feed back into themselves via "context layers", but the output layer does not. The IW matrix only has the forward connections from the input layer to hidden layer 1. In the LW matrix, it looks to me like element {2,1} stores the connections from hidden 1 to hidden 2 and {3,2} stores the connections from hidden 2 to the output. However, there are separate elements storing the "context weights": {1,1} has the connections from hidden 1 to itself and {2,2} has the connections from hidden 2 to itself, judging by their dimensions being (hidden layer size)^2.
The code you've given produces errors because of a dimension mismatch, since there is a separate set of connections for the recurrent/context weights. Regardless, even if I use the correct weights, what do I normalize the hidden activations to? The solution I referenced in my original question used normalization parameters that only apply to the output activation. (I've tested applying them to saved hidden activations, and normalizing them with those parameters does not give me the correct results.)
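For the 3-layer net in the diagram (two hidden layers, layer delay 1), a manual forward pass under the same assumptions (tansig hidden layers, purelin output, mapminmax input/output processing) might look roughly like this sketch; the indexing follows the LW layout described above:

% Sketch for the 2-hidden-layer Elman net above (layer delay 1); trained net and input series x assumed.
xn = mapminmax('apply', x, net.inputs{1}.processSettings{end});    % normalize the external input
z1 = zeros(size(net.IW{1,1},1), 1);      % context state of hidden layer 1
z2 = zeros(size(net.LW{2,1},1), 1);      % context state of hidden layer 2
Y  = zeros(size(net.LW{3,2},1), size(xn,2));
for t = 1:size(xn,2)
    z1 = tansig(net.IW{1,1}*xn(:,t) + net.LW{1,1}*z1 + net.b{1});  % hidden 1 + its own context
    z2 = tansig(net.LW{2,1}*z1      + net.LW{2,2}*z2 + net.b{2});  % hidden 2 + its own context
    Y(:,t) = purelin(net.LW{3,2}*z2 + net.b{3});                   % linear output layer
end
y = mapminmax('reverse', Y, net.outputs{3}.processSettings{end});  % undo output scaling

In this layout only the external input and the final output pass through mapminmax; the hidden/context activations feed straight into the next time step without any renormalization, which may be why applying the output normalization parameters to saved hidden activations did not reproduce the results.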
Greg Heath on 6 Dec 2016
INCORRECT!
The input layer contains NON-NEURON fan-in units and is never counted when referring to an N-layer (1 output + N-1 hidden) neural net.
The equations I have posted are the equations for the DEFAULT 2-LAYER ELMAN net with 1 hidden layer.
As proof, just type in the code from the HELP OR DOC documentation and remove the ending semicolon from the view(net) command.
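The documentation example, reproduced roughly from memory (check help elmannet for the exact listing), is:

[X,T] = simpleseries_dataset;          % example time series from the toolbox
net = elmannet(1:2,10);                % default layer delays 1:2, 10 hidden neurons
[Xs,Xi,Ai,Ts] = preparets(net,X,T);    % shift data and build initial states for the layer delays
net = train(net,Xs,Ts,Xi,Ai);
view(net)                              % shows the default 2-layer architecture
Y = net(Xs,Xi,Ai);
perf = perform(net,Ts,Y)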
The diagram you have just shown is a NON-DEFAULT 3-layer ELMAN net with 2 hidden layers.
Hope this helps.
Thank you for formally accepting my answer
Greg


More Answers (0)
