What is the meaning of the training state "Sum Squared Param (ssX)" while training a neural network with the Bayesian Regularization algorithm?
10 views (last 30 days)
Merve Okan
on 4 Sep 2023
Edited: Harsha Vardhan
on 11 Sep 2023
While solving an Input-Output Fitting problem with a neural network trained with the Bayesian Regularization algorithm, we can plot the neural network training state. I attached an example figure here. The question I would like to ask is: what is the meaning of Sum Squared Param (ssX)? I just learnt that "Num parameters" corresponds to the effective number of parameters, but when I searched for "Sum Squared Param" I couldn't find any direct explanation. Is it the sum squared weights (SSW)?
0 Comments
Accepted Answer
Harsha Vardhan
on 11 Sep 2023
Edited: Harsha Vardhan
on 11 Sep 2023
Hi,
I understand that you want to know the meaning of "Sum Squared Param (ssX)" in the neural network training state plot. This term means the Sum of the Squared Parameters. Here, the parameters are the neural network's weight and bias values, flattened into a single vector.
This conclusion can be drawn from the following sources.
Execute the following command in the command window to view the source code of ‘trainbr.m’ (Bayesian Regularization training function).
open trainbr
In the above source code, the following lines explain 'ssX'.
Line 195: This line stores the neural network’s weight and bias vector in a variable.
worker.WB = calcLib.getwb(calcNet);
Line 211: This line calculates 'ssX' by multiplying the transpose of the weight and bias vector by the vector itself, which gives the sum of its squared elements.
worker.ssX = worker.WB' * worker.WB;
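For intuition, the inner product WB' * WB is simply the sum of the squared entries of the weight-and-bias vector. A minimal sketch in Python (the vector values below are made up purely for illustration; in MATLAB this is what `worker.WB' * worker.WB` computes):

```python
# Hypothetical flattened weight-and-bias vector, analogous to
# what calcLib.getwb(calcNet) returns in trainbr.m.
wb = [0.5, -1.2, 0.3, 2.0]

# ssX = WB' * WB: the inner product of the vector with itself,
# i.e. the sum of its squared elements.
ssX = sum(w * w for w in wb)

print(ssX)  # 0.25 + 1.44 + 0.09 + 4.0 = 5.78
```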
Next, to understand the value returned by calcLib.getwb on line 195 of the trainbr.m file, navigate to the 'getwb' function defined in 'getwb.m'. This can be done by highlighting 'getwb' on that line and pressing 'CTRL+D'.
As seen in the source code of 'getwb.m', the 'getwb' function returns all network weight and bias values as a single vector:
function wb = getwb(net,hints)
%GETWB Get all network weight and bias values as a single vector.
Using the above resources, we can conclude that "Sum Squared Param (ssX)" is the sum of the squares of all the neural network's weight and bias values. So yes, it is essentially the sum squared weights (SSW), extended to include the biases. This quantity is used for regularization.
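To see why ssX matters for regularization: Bayesian regularization minimizes a weighted combination of the sum of squared errors and the sum of squared parameters, which discourages large weights and produces smoother fits. The exact form and the adaptive update of the weighting factors are handled inside trainbr; the sketch below only illustrates the idea, with made-up values and hypothetical alpha/beta weights:

```python
# Illustrative sketch of a regularized objective of the form
# F = beta * SSE + alpha * ssX (values are made up for illustration).
errors = [0.1, -0.2, 0.05]        # hypothetical network output errors
wb = [0.5, -1.2, 0.3]             # hypothetical weight/bias vector

sse = sum(e * e for e in errors)  # sum of squared errors
ssX = sum(w * w for w in wb)      # sum of squared parameters

alpha, beta = 0.01, 1.0           # hypothetical regularization weights
F = beta * sse + alpha * ssX      # large weights increase F, so they are penalized
print(F)
```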
0 Comments
More Answers (0)