How to evaluate the GPU/CPU training and inference time for a deep learning model in MATLAB?
Currently, I need to build a deep learning digit recognition model in MATLAB R2023a and fill out this table.
What would be the most efficient way to do this in MATLAB?
My GPU is an RTX 3080, so I guess I could do it easily.
Answers (1)
Debraj Maji
on 4 Jan 2024
I understand that you are trying to evaluate the time required for training and inference for a Deep Learning Model. To evaluate the GPU/CPU training and inference time for a deep learning model in MATLAB, you can use the “tic” and “toc” functions. Additionally, MATLAB provides the “gputimeit” function to accurately measure the time taken by GPU operations.
The training time is also shown in the top right-hand corner of the training progress window that opens when training starts.
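If you want a programmatic number for the table, a minimal sketch is to wrap the training call in "tic" and "toc" (this assumes your training data, layer array, and training options are already defined, here named imdsTrain, layers, and options):
% Time the full training run (variable names are placeholders for your own setup)
tic;
trainedNet = trainNetwork(imdsTrain, layers, options);
trainingTime = toc;   % elapsed wall-clock training time in seconds
fprintf('Training time: %f seconds\n', trainingTime);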
Here is sample code using the "tic" and "toc" functions for inference:
tic;                                            % start the timer
predictions = classify(trainedNet, testData);   % run inference on the test set
elapsedTime = toc;                              % elapsed wall-clock time in seconds
For further information on the "tic" and "toc" functions, you can refer to the following documentation:
Below is sample code for the "gputimeit" function, which accurately measures the time taken by operations on a GPU:
inferenceFcn = @() classify(trainedNet, testData);  % wrap the inference call in a function handle
if canUseGPU()
    % gputimeit runs the function several times and synchronizes the GPU for an accurate measurement
    gpuTime = gputimeit(inferenceFcn);
    fprintf('Inference time on GPU: %f seconds\n', gpuTime);
end
The Parallel Computing Toolbox is required to run "gputimeit"; the function will throw an error otherwise.
For further information on the "gputimeit" function, you can refer to the following documentation:
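For the CPU column of your table, one possible sketch is to use the base MATLAB "timeit" function and force CPU execution of the same inference call (again assuming the trainedNet and testData variables from above):
% Measure CPU-only inference time; 'ExecutionEnvironment','cpu' keeps classify off the GPU
cpuInferenceFcn = @() classify(trainedNet, testData, 'ExecutionEnvironment', 'cpu');
cpuTime = timeit(cpuInferenceFcn);   % timeit averages several runs for a stable estimate
fprintf('Inference time on CPU: %f seconds\n', cpuTime);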
Hope this helps,
Regards,
Debraj.