Backpropagation neural network
I learned that the activation functions logsig and tansig return values in the ranges [0, 1] and [-1, 1], respectively. What happens if the target values lie beyond these limits?
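For intuition, here is a minimal sketch (assuming the Deep Learning Toolbox, which provides logsig and tansig): both functions saturate, so no input can produce an output outside their range, and a target beyond that range is simply unreachable, meaning the training error has a nonzero floor.

n = linspace(-10, 10, 5);   % sample inputs across a wide range
logsig(n)                   % every value lies strictly inside (0, 1)
tansig(n)                   % every value lies strictly inside (-1, 1)
% A target of e.g. 5 can never be matched by a tansig output neuron:
% the error |5 - tansig(n)| stays above 4 for every possible input n.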
2 Comments
Mohammad Sami
on 8 Jun 2020
One of the reasons is that larger target values can lead to exploding gradients when training the network.
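A common workaround is to rescale the targets into the activation's range before training and invert the mapping afterwards. A hedged sketch with made-up data (mapminmax, feedforwardnet, and train are toolbox functions; note that recent toolbox versions already apply mapminmax as a default processing step, so this only makes the idea explicit):

x = linspace(0, 10, 100);              % example inputs (1x100 row vector)
t = 50*sin(x) + 60;                    % targets far outside [-1 1]
[tn, ts] = mapminmax(t);               % rescale targets into [-1 1]
net = feedforwardnet(10);              % one hidden layer, 10 neurons
net.layers{2}.transferFcn = 'tansig';  % bounded output layer, as in the question
net = train(net, x, tn);               % train on the rescaled targets
y = mapminmax('reverse', net(x), ts);  % map outputs back to the original scale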
Answers (0)