Equation from a simple feedforward neural network
Mohamed BENALLAL
on 16 Apr 2016
Commented: Mohamed BENALLAL
on 19 Apr 2016
Hi everyone, I'm working on code that writes the full equation of a feedforward neural network (FNN) to a text file, taking all weights and biases into account. I already have the trained FNN stored (the "net" file). As a first step, I want to check whether I get the same result when using:
load Net17 net
input = [12,0.2]; % an input example
output = net(input');
and when I do this :
IW = net.IW{1,1} ;
b1 = net.b{1};
b2 = net.b{2};
LW = net.LW{2,1};
y = b2 + LW * tansig(IW * input' + b1);
Note that my net is a simple FNN with one hidden layer: 2 input neurons, 17 hidden neurons (for this example), and one output neuron.
I don't know if I made a mistake, but the results are different:
output = 353.3947
y = -7.8709
Any suggestions? Thanks...
References:
http://fr.mathworks.com/help/nnet/ug/multilayer-neural-network-architecture.html?refresh=true
http://fr.mathworks.com/help/nnet/ref/setwb.html
http://fr.mathworks.com/help/nnet/ref/tansig.html
http://fr.mathworks.com/matlabcentral/answers/165233-neural-network-how-does-neural-network-calculate-output-from-net-iw-net-lw-net-b
0 comments
Accepted Answer
Greg Heath
on 19 Apr 2016
You forgot that the net uses the default MAPMINMAX processing function to normalize the inputs and targets before training, and then to denormalize the output afterwards.
Hope this helps.
Thank you for formally accepting my answer
Greg
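To make the answer concrete, here is a minimal sketch of the manual calculation with the mapminmax pre/post-processing added back in. It assumes the network kept the default 'mapminmax' processing functions, so the stored settings are the first entry of `processSettings` on the input and output; the variable names follow the question's code.

```matlab
% Sketch: reproduce net(input') by hand, including the default
% mapminmax normalization of the input and denormalization of the output.
load Net17 net
input = [12, 0.2];          % an input example

% Extract weights and biases (as in the question)
IW = net.IW{1,1};
b1 = net.b{1};
LW = net.LW{2,1};
b2 = net.b{2};

% Normalize the input with the settings stored at training time
% (assumes 'mapminmax' is the first/only input processing function)
xn = mapminmax('apply', input', net.inputs{1}.processSettings{1});

% Hidden layer (tansig) followed by the linear output layer,
% all in normalized space
yn = b2 + LW * tansig(IW * xn + b1);

% Denormalize the output back to target units
y = mapminmax('reverse', yn, net.outputs{2}.processSettings{1});

% y should now match net(input')
```

With the normalization applied on both ends, `y` should agree with `output = net(input')` up to floating-point rounding.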
More Answers (0)