Error in testing_gan (line 54)
dlnetGenerator = dlnetwork(lgraphGenerator)
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
Error in dlnetwork (line 3)
imageInputLayer([64 64 1], 'Name', 'input', 'Mean', mean(XTrain,0))
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
Error in mean (line 127)
y = sum(x, dim, flag) ./ mysize(x,dim);
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
Error using sum
Invalid data type. First argument must be numeric or logical.

3 Comments

nasir mehmood
nasir mehmood on 24 May 2020
Are you available? Please help me solve this:
Output argument "b" (and maybe others) not assigned during call to "dlnetwork".
Error in bb (line 59)
dlnetGenerator = dlnetwork(lgraphGenerator);
Sophia Lloyd
Sophia Lloyd on 28 Jun 2020
For me, the code in the example that you linked runs as expected.
It seems like you may have modified and saved the original example. There's no function or script named testing_gan in the original example.
From the errors you provided, the problem seems to be the mean value used in the imageInputLayer, which is causing the error inside dlnetwork when it initializes the layer.
imageInputLayer([64 64 1], 'Name', 'input', 'Mean', mean(XTrain,0))
I don’t know how you are providing XTrain, as that variable is not present in the example you linked. Is it a datastore? That would explain the error in sum inside mean.
It would help if you provide the exact code that is causing the error.
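For reference, a minimal sketch of how the per-pixel mean is usually supplied to imageInputLayer, assuming XTrain is a 64x64x1xN numeric array (if it comes from a datastore, it would need to be gathered into an array first; the variable names here are illustrative):

```matlab
% Sketch, assuming XTrain is a 64x64x1xN numeric array of training images.
% If XTrain comes from an imageDatastore, gather it into an array first:
%   imgs   = readall(imds);        % cell array of images
%   XTrain = cat(4, imgs{:});      % stack along the 4th (observation) dim

% The dimension argument to mean must be a positive integer; 0 is invalid.
% Average over the 4th (observation) dimension to get one mean image:
meanImage = mean(XTrain, 4);

layer = imageInputLayer([64 64 1], 'Name', 'input', ...
    'Normalization', 'zerocenter', 'Mean', meanImage);
```

With a numeric meanImage of size 64x64x1, dlnetwork should be able to initialize the input layer without the sum/mean error above.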


Answers (1)

Mahmoud Afifi
Mahmoud Afifi on 23 May 2020

0 Votes

Can you please give a link to the original code? In the meantime, have a look at this GitHub page. It has several GANs with MATLAB implementations.

5 Comments

nasir mehmood
nasir mehmood on 23 May 2020
Your recommended example requires CUDA; it generated this error:
Epoch 0
Unable to load CUDA driver. The library name used was nvcuda.dll. The error was:
The specified module could not be found.
Update or reinstall your GPU driver. For more information on GPU support, see GPU Support by Release.
Error in parallel.internal.gpu.isAnyDeviceSelected
Error in parallel.gpu.GPUDevice.isAvailable (line 119)
if parallel.internal.gpu.isAnyDeviceSelected
Error in canUseGPU (line 25)
ok = canUsePCT() && parallel.gpu.GPUDevice.isAvailable();
Error in bb (line 107)
dlZValidation = dlarray(ZValidation);
Mahmoud Afifi
Mahmoud Afifi on 24 May 2020
But for any of these examples you need CUDA support on your machine; otherwise it is hard to train a GAN on a CPU.
nasir mehmood
nasir mehmood on 24 May 2020
Output argument "b" (and maybe others) not assigned during call to "dlnetwork".
dlnetGenerator = dlnetwork(lgraphGenerator);
Sophia Lloyd
Sophia Lloyd on 28 Jun 2020
It is possible to train a GAN on a CPU, though it is usually not recommended because it will be very slow.
The example https://www.mathworks.com/help/deeplearning/ug/train-generative-adversarial-network.html will run on the CPU if no GPU is available.
The examples on the GitHub page assume that you have a GPU. If you do not, you need to modify the code and remove the calls to gpuArray. This should be enough to run the code on the CPU.
If you do have a supported GPU, you need a suitable driver for your device and platform. We recommend you use the most up-to-date driver for your device. You can get drivers from NVIDIA here: https://www.nvidia.com/Download/index.aspx. You can check whether your GPU is supported here: https://www.mathworks.com/help/parallel-computing/gpu-support-by-release.html
To use the GPU for training, you only need the driver. You do not need to install the CUDA Toolkit.
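As a sketch of how such code is typically made to run on either device: guard the gpuArray call with canUseGPU, so the same script works with or without a GPU (ZValidation here stands in for whatever array the example builds; the pattern is the point, not the names):

```matlab
% Sketch: move data to the GPU only when a supported GPU and driver
% are actually present. canUseGPU needs only the NVIDIA driver, not
% the CUDA Toolkit.
dlZValidation = dlarray(ZValidation, 'CB');

if canUseGPU
    dlZValidation = gpuArray(dlZValidation);  % GPU path
end
% Otherwise dlZValidation stays on the CPU and training proceeds there.
```

This avoids the "Unable to load CUDA driver" error on machines without a working GPU driver, at the cost of slower CPU training.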



Asked: on 23 May 2020
Commented: on 28 Jun 2020
