predict function in Deep Learning Toolbox does not use the GPU
I use a pre-trained network imported from TensorFlow 2.0 to predict a depth image from an RGB image. The code is:
dlX = dlarray(double(I)./255,'SSCB');
dlY = predict(dlnet,dlX);
The code works, but it is very slow. It seems to use only the CPU cores instead of the GPU.
From the online help documentation, it seems that predict should use a GPU by default. My GPU appears to be available in MATLAB; I verified it with a quick test:
gpuDevice;
A = gpuArray([1 0 1; -1 -2 0; 0 1 -1]);
e = eig(A);
This runs fine on my GPU, and gpuDevice reports:
Name: 'GeForce RTX 2060'
Index: 1
ComputeCapability: '7.5'
SupportsDouble: 1
DriverVersion: 11.2000
ToolkitVersion: 11
MaxThreadsPerBlock: 1024
MaxShmemPerBlock: 49152
MaxThreadBlockSize: [1024 1024 64]
MaxGridSize: [2.1475e+09 65535 65535]
SIMDWidth: 32
TotalMemory: 6.4425e+09
AvailableMemory: 4.9872e+09
MultiprocessorCount: 30
ClockRateKHz: 1200000
ComputeMode: 'Default'
GPUOverlapsTransfers: 1
KernelExecutionTimeout: 1
CanMapHostMemory: 1
DeviceSupported: 1
DeviceAvailable: 1
DeviceSelected: 1
Is there any way to deal with this problem? Thank you very much.
Answers (1)
Joss Knight
on 14 Aug 2021
That is the documentation for DAGNetwork, not dlnetwork. dlnetwork does not have an ExecutionEnvironment option; it chooses its execution environment the same way other GPU operations do, by reacting to the incoming data. As KSSV points out, converting the input to a gpuArray is the correct solution in this case.
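A minimal sketch of that fix, using the variable names from the question (`I` and `dlnet` are assumed to exist already): move the input to the GPU before building the dlarray, and predict will then run on the GPU because that is where the incoming data lives. Using single instead of double is also worthwhile here, since GeForce cards have limited double-precision throughput.

```matlab
% Move the input image to the GPU, then wrap it in a dlarray.
% predict runs on the GPU because the incoming data is a gpuArray.
dlX = dlarray(gpuArray(single(I)) ./ 255, 'SSCB');
dlY = predict(dlnet, dlX);

% Bring the result back to host memory for any CPU-side post-processing.
Y = gather(extractdata(dlY));
```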