trainnet function - randomness and repeatability

Hello all,
I'm experimenting with the trainnet function. When I run the same script multiple times, the training outputs are slightly different, even though I have removed every source of randomness I'm aware of - such as random label splitting, batch shuffling, etc.
Does anybody know what causes the outcomes to differ slightly on each run? The basic steps of my script are below.
training_imds = imageDatastore(Training_data_folder,"IncludeSubfolders",true,"LabelSource","foldernames");
%-----------------------
% Training data split
% only part of the training dataset is used for training
% training_imdsVal is not used
[training_imds_Train,training_imdsVal] = splitEachLabel(training_imds,0.3);
training_imds = training_imds_Train;
%-----------------------
% Training process - train - val data split
[training_imds_Train,training_imds_Val] = splitEachLabel(training_imds,0.9);
training_imds_Train_au = augmentedImageDatastore([imHeight imWidth],training_imds_Train);
training_imds_Val_au = augmentedImageDatastore([imHeight imWidth],training_imds_Val);
layers = [
    imageInputLayer([imHeight imWidth 3]) % image size and RGB (=3)
    convolution2dLayer(20,20)             % 20-by-20 filters, 20 filters
    reluLayer()
    maxPooling2dLayer(3)
    fullyConnectedLayer(2)
    softmaxLayer()
    ];
options = trainingOptions("sgdm", ...
    Metrics="accuracy", ...
    InitialLearnRate=0.000001, ...
    ValidationData=training_imds_Val_au, ...
    MiniBatchSize=128, ...
    ValidationFrequency=25, ...
    ValidationPatience=5, ...
    MaxEpochs=1, ...
    LearnRateSchedule="piecewise", ...
    LearnRateDropPeriod=5, ...
    ExecutionEnvironment="cpu");
trained_net = trainnet(training_imds_Train_au,layers,"crossentropy",options);
When I run this multiple times, I get different outcomes, e.g.:
Iteration Epoch TimeElapsed LearnRate TrainingLoss ValidationLoss TrainingAccuracy ValidationAccuracy
_________ _____ ___________ _________ ____________ ______________ ________________ __________________
0 0 00:00:10 1e-06 1.4484 81.864
1 1 00:00:10 1e-06 5.8723 60.156
25 1 00:01:31 1e-06 1.3652 0.40944 91.406 97.229
50 1 00:03:17 1e-06 0.12455 1.051e-09 99.219 100
55 1 00:03:43 1e-06 0.16857 0 98.438 100
Training stopped: Max epochs completed
Iteration Epoch TimeElapsed LearnRate TrainingLoss ValidationLoss TrainingAccuracy ValidationAccuracy
_________ _____ ___________ _________ ____________ ______________ ________________ __________________
0 0 00:00:09 1e-06 7.5433 51.637
1 1 00:00:10 1e-06 8.6849 41.406
25 1 00:01:27 1e-06 0.9964 0.14055 93.75 99.118
50 1 00:03:03 1e-06 0.62306 0.022228 96.094 99.748
55 1 00:03:23 1e-06 0.12455 0.009824 99.219 99.874
Training stopped: Max epochs completed

Accepted Answer

Steven Lord
Steven Lord about 20 hours ago
What happens when you reset the state or seed of the random number generator before each attempt to train the network? Let's choose an arbitrary seed value and generate some numbers.
rng(42)
x1 = rand(1, 5);
If we reset the seed to the same value, the generator starts in the same place and generates the same numbers.
rng(42)
x2 = rand(1, 5);
isequal(x1, x2) % Same values, down to the last bit
ans = logical
1
But generating new values doesn't generate the same values as the freshly-reset generator.
x3 = rand(1, 5);
isequal(x1, x3) % No, x3 contains different values
ans = logical
0
You may have removed the randomness from your own code, but the network's learnable parameters are still initialized with random values internally before training.
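If that internal initialization is the remaining source of nondeterminism, resetting the generator immediately before each training run should make the runs repeatable. A minimal sketch, reusing the variables from the question (the seed value 42 is arbitrary):

```matlab
% Reset the random number generator right before training so the network's
% learnable parameters are initialized identically on every run.
rng(42);   % arbitrary fixed seed
trained_net = trainnet(training_imds_Train_au, layers, "crossentropy", options);
```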

1 comment

Oldrich
Oldrich about an hour ago
Thank you very much for the clarification. I was not familiar with these random number generator settings.


More Answers (2)

Matt J
Matt J about 19 hours ago
Edited: Matt J about 19 hours ago

0 votes

In addition to random initialization of the Learnables, as mentioned by @Steven Lord, you are using the default Shuffle setting, which performs a random reordering of the training inputs once at the beginning of the training process.
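If you want to remove that shuffle entirely (rather than just making it repeatable by seeding), trainingOptions accepts a Shuffle setting. A sketch of the relevant name-value pair, shown here with only the options needed to illustrate it:

```matlab
% Default is Shuffle="once" (training data shuffled once before training).
% "never" keeps the datastore order; "every-epoch" reshuffles each epoch.
options = trainingOptions("sgdm", ...
    Shuffle="never", ...
    ExecutionEnvironment="cpu");
```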

1 comment

Oldrich
Oldrich about an hour ago
Thank you for your comment - you are right about shuffling.


Oldrich
Oldrich about 17 hours ago

0 votes

When I applied the suggestions of @Steven Lord and @Matt J - resetting the random number generator at the beginning of each run and removing shuffling entirely - I received identical results on every run.
@Steven Lord @Matt J Thank you both very much for clarifying these network-training details.

1 comment

Matt J
Matt J about 15 hours ago
If you reset the random number generator, turning off shuffling should make no difference.
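That is, once the seed is reset, the shuffle itself becomes deterministic: the same permutation is drawn on every run. A small illustration in the spirit of the rand example above:

```matlab
rng(0); p1 = randperm(10);   % permutation a "shuffled" run would use
rng(0); p2 = randperm(10);   % same seed -> same permutation
isequal(p1, p2)              % returns logical 1 (true)
```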


Version: R2025b

Asked: on 9 Apr 2026 at 12:03

Commented: about 15 hours ago
