How to use GPU for deep learning

11 views (last 30 days)
Alexey Kozhakin on 1 Oct 2022
Answered: KSSV on 1 Oct 2022
I'm training a YOLOv4 detection model in MATLAB. I just got a computer with a graphics card, an NVIDIA GeForce RTX 3070 Ti, and I want to get the most out of it. Please help me: what do I need to write in my MATLAB code to train on the GPU?

Answers (1)

KSSV on 1 Oct 2022
Check trainingOptions; it has an 'ExecutionEnvironment' option that lets you specify where training runs ('auto', 'cpu', 'gpu', 'multi-gpu', or 'parallel').
Example:
options = trainingOptions('sgdm', ...
    'Momentum',0.9, ...
    'InitialLearnRate',initLearningRate, ...
    'LearnRateSchedule','piecewise', ...
    'LearnRateDropPeriod',learningDropPeriod, ...
    'LearnRateDropFactor',learningRateFactor, ...
    'L2Regularization',l2reg, ...
    'MaxEpochs',maxEpochs, ...
    'ValidationData',{inputVal, targetVal}, ...
    'ValidationFrequency',50, ...
    'ValidationPatience',10, ...
    'Shuffle','every-epoch', ...
    'MiniBatchSize',miniBatchSize, ...
    'GradientThresholdMethod','l2norm', ...
    'GradientThreshold',0.01, ...
    'Plots','training-progress', ...
    'ExecutionEnvironment','auto', ... % <-- check this; keep it 'auto' so MATLAB picks the GPU when available
    'Verbose',true);
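
Before training, it can help to confirm that MATLAB actually sees the GPU. A minimal sketch (assumes Parallel Computing Toolbox is installed; canUseGPU and gpuDevice are standard functions from that toolbox):

% Check whether a supported GPU is available (requires Parallel Computing Toolbox)
if canUseGPU()
    gpu = gpuDevice;   % select and query the default GPU
    fprintf('Training will run on: %s\n', gpu.Name);
else
    warning('No supported GPU found; training will fall back to the CPU.');
end

With 'ExecutionEnvironment' set to 'auto', MATLAB uses the GPU automatically when one is available; setting it to 'gpu' forces GPU training and errors if no supported GPU is present.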

Version

R2022a
