Save intermediate model in MATLAB while training a deep learning model and resume training from that stage later
7 views (last 30 days)
parvathy prathap
on 12 Oct 2021
Commented: SIMON
on 26 Jul 2025
Hi,
I am training a three-pipeline deep learning model in MATLAB, which takes a long time to train. I need to save intermediate variable values during training, stop the training process, and later resume training from the point where it was stopped. Does MATLAB have any options to do this? Any help in this regard would be highly appreciated.
Thanks in advance.
0 Comments
Accepted Answer
Mahesh Taparia
on 15 Oct 2021
Hi
You can set the checkpoint path in trainingOptions, as suggested in the other answer. The trained network is saved to the specified path as a MAT file at regular intervals. To resume training, load that checkpoint into a net variable and start a new training run from its layers. For example, you can refer to this documentation.
Hope it helps!
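A minimal sketch of that workflow, assuming training data in XTrain/YTrain and an existing layer array layers; the checkpoint file name below only illustrates the net_checkpoint__<iteration>__<timestamp>.mat naming pattern:
% Train with checkpointing enabled: checkpoint MAT files (each containing a
% variable "net") are written to the folder given by CheckpointPath
options = trainingOptions('sgdm', ...
    'MaxEpochs', 20, ...
    'CheckpointPath', tempdir);
net = trainNetwork(XTrain, YTrain, layers, options);
% Later: load a checkpoint and start a new training run from those weights
% (for a DAGNetwork checkpoint, pass layerGraph(net) instead of net.Layers)
load(fullfile(tempdir, 'net_checkpoint__350__2021_10_15__10_00_00.mat'), 'net');
net = trainNetwork(XTrain, YTrain, net.Layers, options);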
1 Comment
Jeet Agrawal
on 13 Apr 2023
Do I need to create another script, load the model there, and continue training from it?
Suppose I am saving the model every 50 iterations using
CheckpointFrequency = 50 and CheckpointFrequencyUnit = 'iteration'.
The last checkpoint was saved at iteration 750; how do I run again from iteration 751?
As per the linked example I can reduce the number of epochs, but how can I reduce the number of iterations?
My data set has 20000 files and the mini-batch size is 40, so one epoch takes 500 iterations. I cannot wait for 500 iterations to complete because that takes around 5 hours, so I want to save every 50 iterations.
Could you please share a snippet?
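A hedged sketch of what such a resume step could look like, assuming training data in XTrain/YTrain and a placeholder checkpoint folder. As far as I know, trainNetwork cannot literally continue its counter at iteration 751; the usual workaround is to load the weights from the most recent checkpoint and train only for the remaining epochs:
% Pick the most recently written checkpoint file (placeholder folder name)
checkpointDir = 'C:\myCheckpoints';
files = dir(fullfile(checkpointDir, 'net_checkpoint__*.mat'));
[~, idx] = max([files.datenum]);                       % newest checkpoint
load(fullfile(checkpointDir, files(idx).name), 'net'); % loads variable "net"
% Resume from the checkpointed weights, training only the remaining epochs
optionsResume = trainingOptions('sgdm', ...
    'MaxEpochs', 2, ...                                % remaining epochs (example value)
    'MiniBatchSize', 40, ...
    'CheckpointPath', checkpointDir, ...
    'CheckpointFrequency', 50, ...
    'CheckpointFrequencyUnit', 'iteration');
net = trainNetwork(XTrain, YTrain, net.Layers, optionsResume);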
More Answers (1)
yanqi liu
on 15 Oct 2021
Sir, maybe use CheckpointPath, such as:
options = trainingOptions('sgdm', ...
    'MaxEpochs', 5, ...
    'MiniBatchSize', 1, ...
    'InitialLearnRate', 1e-3, ...
    'CheckpointPath', tempdir);
1 Comment
SIMON
on 26 Jul 2025
Hello everyone, here is a nice example of how you can save your model every time you train it, and reload it so you can keep training it again and again:
% 🧱 Define LSTM model
layers = [
    sequenceInputLayer(size(featuresNorm, 2))
    lstmLayer(50, 'OutputMode', 'last')            % Capture temporal info
    dropoutLayer(0.2)                              % Prevent overfitting
    fullyConnectedLayer(size(targetNorm, 2))
    regressionLayer
    ];
% ⚙️ Set training options
options = trainingOptions('adam', ...
    'MaxEpochs', 100, ...
    'MiniBatchSize', 32, ...
    'InitialLearnRate', 0.005, ...
    'GradientThreshold', 1, ...
    'Shuffle', 'every-epoch', ...
    'ValidationFrequency', 30, ...
    'ValidationPatience', 5, ...
    'Plots', 'training-progress', ...
    'Verbose', false);
% Load the previously trained model, if one exists, and warm-start from its layers
if isfile('trainedLSTMModel.mat')
    load('trainedLSTMModel.mat', 'net');
    layers = net.Layers;   % reuse the learned weights as the starting point
end
% 🚀 Train the network
net = trainNetwork(featureSequence, targetNorm, layers, options);
save('trainedLSTMModel.mat', 'net');
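One caveat with this pattern: the warm start carries over only the learned weights through net.Layers; the optimizer state and the epoch counter are not restored, so each call to trainNetwork begins a fresh training run initialized from the previously saved network.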