Back propagation neural network

Sivamani S on 8 Jun 2020
Commented: Sivamani S on 8 Jun 2020
I learnt that the activation functions logsig and tansig return output values in the ranges [0 1] and [-1 1], respectively. What will happen if the target values lie beyond these limits?
  2 Comments
Mohammad Sami on 8 Jun 2020
One issue is that larger target values can result in exploding gradients when training the network.
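A common remedy is to scale the targets into the range of the output transfer function before training and map the predictions back afterwards, for example with mapminmax. Below is a minimal sketch under assumed placeholder data and a tansig output layer chosen for illustration (note that feedforwardnet also applies mapminmax to targets by default, so the explicit scaling here mainly makes the idea visible):

% Placeholder inputs and targets; the targets lie well outside tansig's [-1, 1] range
x     = rand(3, 20);
t_raw = 50 + 100*rand(1, 20);

% Scale the targets into [-1, 1] and keep the settings for later
[t_norm, ts] = mapminmax(t_raw, -1, 1);

net = feedforwardnet(10);
net.layers{2}.transferFcn = 'tansig';   % tansig output layer, for illustration
net = train(net, x, t_norm);

% Map the normalized predictions back to the original target scale
y_norm = net(x);
y      = mapminmax('reverse', y_norm, ts);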
Sivamani S on 8 Jun 2020
Dear Mohammad Sami,
I gave 3 inputs and 1 target value to the BPNN and obtained the weight and bias values. When I manually apply these weights and biases to the ANN model, I do not get the output that the ANN provides. I tried this for one sample run. Is there any way to check the ANN output manually?
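One likely source of the mismatch is that MATLAB networks normalize inputs and targets internally, so the stored weights operate on normalized values. A minimal sketch of a manual check, assuming a 3-input, 1-output feedforwardnet with one tansig hidden layer and purelin output, and placeholder data (not the original poster's setup):

% Placeholder data: 3 inputs, 1 target, 20 samples
x = rand(3, 20);
t = rand(1, 20);

net = feedforwardnet(10);                      % 10 tansig hidden neurons, purelin output
net.inputs{1}.processFcns  = {'mapminmax'};    % keep only mapminmax so the
net.outputs{2}.processFcns = {'mapminmax'};    % settings indices below stay simple
net = train(net, x, t);

% --- Reproduce the network output by hand for one sample ---
xs = x(:, 1);

% Apply the same input normalization the network uses internally
xn = mapminmax('apply', xs, net.inputs{1}.processSettings{1});

a1 = tansig(net.IW{1,1} * xn + net.b{1});      % hidden layer
yn = net.LW{2,1} * a1 + net.b{2};              % purelin output layer

% Undo the output normalization and compare with the network's own output
y_manual = mapminmax('reverse', yn, net.outputs{2}.processSettings{1});
y_net    = net(xs);
disp([y_manual, y_net])                        % the two values should match closely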


Answers (0)
