Why does an interactive MATLAB job require less memory than a non-interactive job on a cluster?
Hi everyone,
I am running MATLAB on my school's cluster (Linux). The original data read into MATLAB is up to 4 GB, and my code also needs a 24 GB array for the calculation. For an interactive MATLAB job I requested 12 cores and 24 GB of memory with this command (qsh -pe smp 12 -l h_vmemm=2 matlab=12), and the job runs successfully.
However, when I requested 12 cores with 50 GB for a non-interactive job, it failed partway through my code. After I increased the memory to 80 GB it ran further, but it still stopped. Even using the clear command to free the big arrays did not help!
Can anyone tell me what is wrong with the non-interactive job?
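One thing worth checking: on many Grid Engine clusters the `h_vmem` limit given with `-l` is enforced *per slot*, so a parallel-environment request multiplies it by the slot count. A minimal sketch of a batch submission script, assuming Grid Engine semantics (the 7 GB figure and the directives are illustrative, not the poster's actual settings):

```shell
#!/bin/bash
# Hypothetical SGE batch script: 12 slots with a per-slot
# virtual-memory limit. Because h_vmem is a per-slot consumable,
# the effective job-wide limit is slots * h_vmem.
#$ -pe smp 12
#$ -l h_vmem=7G

slots=12
per_slot_gb=7
total_gb=$((slots * per_slot_gb))
echo "effective memory limit: ${total_gb} GB"
```

If the interactive and batch queues interpret the memory request differently (per slot vs. per job), the same nominal number can mean very different effective limits, which would explain jobs dying at "50 GB" that survive at a smaller per-slot interactive request.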
2 Comments
Kojiro Saito
on 13 Jan 2018
Which function do you use for the non-interactive job: parfor, batch, or spmd? One point is that there is a transparency restriction in parfor, so please take a look at the documentation on Transparency in parfor.
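A minimal sketch of the transparency restriction mentioned above (the variable names are illustrative): MATLAB must be able to see every variable referenced in a parfor body at parse time, so commands like `clear`, `load` without an output, or `eval` are not allowed inside the loop. Assigning `[]` is the transparent way to release memory, and temporaries are in any case destroyed at the end of each iteration:

```matlab
s = zeros(1, 4);
parfor i = 1:4
    A = rand(2000);     % large temporary array
    s(i) = sum(A(:));
    % clear A           % NOT allowed inside parfor: transparency violation
    A = [];             % transparent alternative for releasing the memory
end
```

So if the non-interactive code calls `clear` inside a parfor loop, it can fail there even though the same `clear` works fine in a plain serial script.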
Answers (0)