Exclusively Utilizing NVIDIA GeForce RTX GPU for MATLAB UNet Model Training: Issue with GPU Selection

Hi,
I am using MATLAB to train a UNet model for semantic segmentation purposes on my desktop computer running Windows 11. My computer is equipped with a CPU, GPU 0 (Intel(R) UHD Graphics 770), and GPU 1 (NVIDIA GeForce RTX 3070).
My goal is to exclusively utilize GPU 1 (NVIDIA GeForce RTX 3070) for the training process. To ensure this, I have set the 'ExecutionEnvironment' option in my training parameters to 'gpu'.
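For reference, the setup looks roughly like this (a minimal sketch; the solver, epoch count, and batch size here are illustrative placeholders, not my exact script). Note that gpuDevice only enumerates CUDA-capable devices, so the RTX 3070 is typically device index 1 in MATLAB even though Windows labels it "GPU 1":

```matlab
% Select the NVIDIA GPU explicitly (index 1 in MATLAB's CUDA device list;
% the Intel iGPU is not CUDA-capable and is not enumerated by gpuDevice).
gpuDevice(1);

% Training options with GPU execution forced on (illustrative values).
opts = trainingOptions("adam", ...
    "ExecutionEnvironment", "gpu", ...   % train on the selected GPU
    "MaxEpochs", 30, ...
    "MiniBatchSize", 8, ...
    "Plots", "training-progress");
```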
However, during training, GPU 1's usage remains at 0% while CPU usage is considerably high. Even after calling delete(gcp('nocreate')) in my code, only GPU 0 (Intel(R) UHD Graphics 770) shows minor activity, at around 1–5% usage.
I'm seeking guidance on how to resolve this issue and ensure that my UNet model is trained exclusively using GPU 1. Is there a specific configuration or step that I might be missing? Your assistance in resolving this matter would be greatly appreciated.
  7 Comments
Sam Marshalik on 5 Sep 2023
@Gobert, let's confirm that you are able to use the GPU device in general. Can you try something like this:
a = rand(100);       % 100x100 random matrix on the CPU
aGPU = gpuArray(a);  % copy it to the currently selected GPU
fft(aGPU)            % run an FFT on the GPU
Does the above command run successfully? Can you try bumping up the size of 'a' and see if you can see your GPU doing something in Task Manager?
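A larger variant of the same check (a sketch; the matrix size and loop count are arbitrary) should produce sustained, visible load. Keep in mind that Task Manager's default GPU graphs show the 3D/graphics engines, so you may need to switch one of the panels to the "Cuda"/"Compute" engine to see CUDA activity:

```matlab
a = rand(8000);          % ~512 MB double matrix on the CPU
aGPU = gpuArray(a);      % transfer to the GPU
for k = 1:50
    b = fft(aGPU);       % repeated FFTs to sustain GPU load
end
wait(gpuDevice);         % block until the queued GPU work actually finishes
```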


Answers (0)


Version

R2023a

