Hidden Layer Activations in NN Toolbox
I'm looking for a non-manual way to compute the layer activations of an arbitrary neural network created with the Neural Network Toolbox.
Consider the following example for a detailed description of the problem:
[x,t] = simplefit_dataset; % toy data
net = fitnet(5); % initialize network
net.inputs{1}.processFcns = {}; % don't pre-process data
net.outputs{2}.processFcns = {}; % don't post-process data
net = train(net,x,t); % train network
The output of the network can be obtained using the sim function or manually:
testX = 0.5; % test value
testYnntbx = sim(net,testX) % automatic computation of network output
testYmanual = net.LW{2} ... % manual computation of network output
*(tansig(net.IW{1}*testX+net.b{1})) ...
+net.b{2}
The activations of the neurons in the hidden layer are:
testAmanual = tansig(net.IW{1}*testX+net.b{1})
I'm looking for a way to get the layer activations without manually specifying the equations, analogous to the sim function.
Accepted Answer
Greg Heath
on 8 Jun 2013
You can create a second net with no hidden layer, then make its output layer identical to the hidden layer of the first net.
net2 = fitnet([]);
will create a net with no hidden layer.
If the target matrix is not 5-dimensional, create a 5-dimensional target so that you can configure the correct topology. If t is 1-dimensional, use
net2 = configure(net2,x, repmat(t,5,1));
Now you can replace the random initial weights of net2 with the hidden weights of net1.
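Putting those steps together, a sketch might look like the following (untested, as noted above; clearing the processing functions and setting the transfer function are my assumptions, chosen so that net2's output layer exactly mirrors net1's hidden layer from the question):

```matlab
net2 = fitnet([]);                         % net with no hidden layer
net2.inputs{1}.processFcns  = {};          % match net1: no pre-processing
net2.outputs{1}.processFcns = {};          % match net1: no post-processing
net2 = configure(net2, x, repmat(t,5,1));  % 5-dimensional target -> 5 output neurons
net2.layers{1}.transferFcn = 'tansig';     % same activation as net1's hidden layer
net2.IW{1} = net.IW{1};                    % copy net1's hidden-layer weights
net2.b{1}  = net.b{1};                     % copy net1's hidden-layer biases
testA = sim(net2, testX)                   % hidden-layer activations of net1
```

If this works, sim(net2,testX) should reproduce the manual result tansig(net.IW{1}*testX+net.b{1}) from the question.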
I have not tried this. Therefore, there may be some details for you to work out.
Another approach might be to duplicate net1 (net3 = net1) and then remove the output layer of net3. I can't see how to do this, but it may be possible.
Hope this helps.
Thank you for formally accepting my answer
Greg