neural network nprtool tansig vs logsig
Hello,
I am a little confused about the nprtool in the Neural Network Toolbox. It generates a two-layer feedforward network with a tansig activation on the output layer. However, it expects binary targets in {0,1}, and it seems to work correctly.
I wonder why it doesn't use a logsig activation if the output is supposed to be {0,1}. When I manually change the output activation to logsig, the generated output gets compressed into the [0.5,1] range, which is wrong.
I can't explain what the problem is.
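For context, MATLAB's tansig is the hyperbolic tangent and logsig is the logistic sigmoid. A minimal NumPy sketch (the function definitions below are my own stand-ins, not the Toolbox implementations) shows why a tansig output layer can still fit {0,1} targets: its open range (-1, 1) contains both target values.

```python
import numpy as np

# Python stand-ins for MATLAB's activation functions (illustration only):
def tansig(x):
    return np.tanh(x)

def logsig(x):
    return 1.0 / (1.0 + np.exp(-x))

# tansig's open range (-1, 1) contains both binary targets 0 and 1,
# so a net with a tansig output can drive its outputs arbitrarily
# close to 0 and to 1.
x = np.linspace(-6.0, 6.0, 13)
assert tansig(x).min() > -1 and tansig(x).max() < 1
assert logsig(x).min() > 0 and logsig(x).max() < 1
```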
Thanks
Accepted Answer
More Answers (2)
Greg Heath
on 15 Apr 2013
Edited: Greg Heath on 15 Apr 2013
You cannot manually substitute logsig after the net has been trained with purelin:
logsig(0:1) = 0.5 0.7311
The net has to be trained with logsig.
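Greg's one-liner can be checked numerically; a small NumPy sketch (the `logsig` below is my own stand-in for the MATLAB function) reproduces the compression:

```python
import numpy as np

def logsig(x):  # Python stand-in for MATLAB's logsig
    return 1.0 / (1.0 + np.exp(-x))

# A trained net already produces outputs near the {0, 1} targets.
# Passing those fitted outputs through logsig afterwards squashes
# them a second time:
trained_outputs = np.array([0.0, 1.0])
print(logsig(trained_outputs))  # approximately [0.5, 0.7311]
```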
Hope this helps.
Thank you for formally accepting my answer
Greg
2 Comments
Cagdas Ozgenc
on 28 Apr 2013
Greg Heath
on 29 Apr 2013
Please post the script, initial RNG state, and the results from using one of the MATLAB classification nndatasets.
Vito
on 29 Apr 2013
This is fuzzy logic: AND = min(a,b), OR = max(a,b). The binary operator S can represent the OR boundary conditions: S(1, 1) = 1, S(a, 0) = S(0, a) = a (logsig)
2 Comments
Greg Heath
on 29 Apr 2013
Wha??
Vito
on 17 May 2013
Look at the theory, from classical logic through three-valued and fuzzy logic. The basis of fuzzy logic is the algebra of minima and maxima, which has the same properties as Boolean algebra. The MATLAB help covers this in general terms.
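The min/max algebra referred to above can be sketched in a few lines (Python; the function names are my own, and the boundary conditions are the ones cited in the answer):

```python
# Fuzzy-logic connectives via minima and maxima:
def fuzzy_and(a, b):
    return min(a, b)

def fuzzy_or(a, b):  # a t-conorm S
    return max(a, b)

# Boundary conditions: S(1, 1) = 1 and S(a, 0) = S(0, a) = a
assert fuzzy_or(1, 1) == 1
for a in (0.0, 0.3, 1.0):
    assert fuzzy_or(a, 0) == a == fuzzy_or(0, a)
print("t-conorm boundary conditions hold")
```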