The ANN program doesn't run for all 50 epochs that were specified.

Tanmoy
Tanmoy on 30 May 2024
Answered: Jayanti on 16 Sep 2024
Here is the code:
inputs = readmatrix('C:\Users\tanmo\Downloads\Input-Station-2.xlsx');
targets = readmatrix('C:\Users\tanmo\Downloads\Target-Station-2.xlsx');
inputs = inputs'; % Transpose if necessary
targets = targets'; % Transpose if necessary
net = feedforwardnet([10 10]);
net.trainFcn = 'trainlm'; % Using Levenberg-Marquardt backpropagation
net.trainParam.epochs = 50;
net.trainParam.lr = 0.01;
net.trainParam.max_fail = 50; % Maximum validation failures
net.divideParam.trainRatio = 0.70;
net.divideParam.valRatio = 0.15;
net.divideParam.testRatio = 0.15;
net.trainParam.goal = 0; % Essentially remove the performance goal (added later)
net.trainParam.min_grad = 1e-10; % Set a very small gradient goal
[net, tr] = train(net, inputs, targets);
outputs = net(inputs);
performance = perform(net, targets, outputs);
plotperform(tr)
I can't fix the number of iterations or epochs. Each run stops after a different, seemingly random number of epochs, and the performance plot only covers those epochs.

Answers (1)

Jayanti
Jayanti on 16 Sep 2024
When you train the neural network, it may not run for the specified number of epochs if any of the other stopping criteria is met first. In particular, the gradient can fall below the "min_grad" value of 1e-10, in which case training stops before all 50 epochs have run.
From the attached image, you can see that the gradient drops below the min_grad value at epoch 15, and the message "Minimum gradient reached" is shown at the bottom of the nntraintool dialog box.
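If the training window is not open, the same information can also be read programmatically from the training record tr returned by train; a minimal sketch using standard fields of the training record:
% Inspect the training record to see why training stopped early
disp(tr.stop)                                   % stop reason, e.g. 'Minimum gradient reached.'
fprintf('Ran %d of %d epochs\n', tr.num_epochs, net.trainParam.epochs);
fprintf('Final gradient: %g (min_grad = %g)\n', tr.gradient(end), net.trainParam.min_grad);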
To solve this issue, lower the value of "min_grad" even further (for example, set it to 0) so that the minimum-gradient criterion cannot stop training before the epoch limit is reached.
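For example, a minimal sketch of the training settings with the early-stopping criteria relaxed (assuming the same inputs and targets as in the question); in most cases only the epoch limit will then end training:
% Relax the early-stopping criteria so training normally runs the full 50 epochs
net = feedforwardnet([10 10]);
net.trainFcn = 'trainlm';
net.trainParam.epochs   = 50;   % desired number of epochs
net.trainParam.goal     = 0;    % no performance goal
net.trainParam.max_fail = 50;   % validation failures will not stop a 50-epoch run
net.trainParam.min_grad = 0;    % disable the minimum-gradient stop
[net, tr] = train(net, inputs, targets);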
Hope this helps!

Version

R2021b
