Hidden Layer Activations in NN Toolbox

Ahmed on 7 Jun 2013
I'm looking for a non-manual way to compute the layer activations of an arbitrary neural network created with the Neural Network Toolbox.
Consider the following example for a detailed description of the problem:
[x,t] = simplefit_dataset; % toy data
net = fitnet(5); % initialize network
net.inputs{1}.processFcns = {}; % don't pre-process data
net.outputs{2}.processFcns = {}; % don't post-process data
net = train(net,x,t); % train network
The output of the network can be obtained using the sim function or manually:
testX = 0.5; % test value
testYnntbx = sim(net,testX) % automatic computation of network output
testYmanual = net.LW{2} ...                  % manual computation of network output
    *(tansig(net.IW{1}*testX+net.b{1})) ...
    +net.b{2}
The activations of the neurons in the hidden layer are:
testAmanual = tansig(net.IW{1}*testX+net.b{1})
I'm looking for a way to get the layer activations without manually specifying the equations, analogous to the sim function.

Accepted Answer

Greg Heath on 8 Jun 2013
You can create a second net with no hidden layer, then make the output layer of that second net identical to the hidden layer of the first net.
net2 = fitnet([]);
will create a net with no hidden layer.
If the target matrix is not 5-dimensional, create a 5-dimensional target so that configure can set up the correct topology. If t is 1-dimensional, use
net2 = configure(net2,x, repmat(t,5,1));
Now you can replace the random initial weights of net2 with the hidden weights of net1.
I have not tried this. Therefore, there may be some details for you to work out.
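Putting the steps together, a rough sketch might look like this. It is untested and assumes the trained net from the question (called net there, with the default tansig hidden layer) together with x, t and testX in the workspace:
net2 = fitnet([]);                          % net with no hidden layer
net2.inputs{1}.processFcns  = {};           % match the question: no pre-processing
net2.outputs{1}.processFcns = {};           % no post-processing either
net2 = configure(net2, x, repmat(t,5,1));   % 5-dimensional target sets the topology
net2.layers{1}.transferFcn = 'tansig';      % same transfer function as the hidden layer
net2.IW{1,1} = net.IW{1,1};                 % copy the trained input weights
net2.b{1}    = net.b{1};                    % copy the trained hidden-layer biases
testAnet2 = sim(net2,testX)                 % should match testAmanual from the question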
Another approach might be to copy net1 (net3 = net1) and then remove the output layer of net3. I can't see how to do this, but it may be possible.
Hope this helps.
Thank you for formally accepting my answer
Greg
1 Comment
Ahmed on 10 Jun 2013
Replicating the reduced network is a viable workaround. However, every change to the original network potentially requires the reduced network to be redefined by hand, which makes this approach a serious source of errors.
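One way to reduce that risk would be to rebuild the reduced net from the trained net whenever the activations are needed, instead of maintaining it by hand. A rough, untested sketch of such a helper (the function name hiddenActivations is made up for illustration; it assumes pre/post-processing is disabled, as in the question):
function a = hiddenActivations(net1, x, X)
% Return the hidden-layer activations of the trained net net1 for inputs X.
% x is the data net1 was trained on; pre/post-processing is assumed to be off.
nHidden = net1.layers{1}.size;                        % number of hidden neurons
net2 = fitnet([]);                                    % single-layer net
net2.inputs{1}.processFcns  = {};
net2.outputs{1}.processFcns = {};
net2 = configure(net2, x, zeros(nHidden,size(x,2)));  % dummy target fixes the sizes
net2.layers{1}.transferFcn = net1.layers{1}.transferFcn; % copy the transfer function
net2.IW{1,1} = net1.IW{1,1};                          % copy trained weights and biases
net2.b{1}    = net1.b{1};
a = sim(net2, X);
end
Called as hiddenActivations(net, x, testX), it would be rebuilt from the current net on every call, so the two networks cannot drift apart.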
