Parallel Job causing memory leak?

I have converted a script into a parallel job with a pretty simple outline, roughly this:
---------------------
sched = findResource('scheduler', 'type', 'local');
set(sched, 'ClusterSize', 6);
for a = 1:nLoops
    job = createJob(sched);
    % create 6 tasks, start scheduler with script, etc.
    % create sub-directories for each
    % collect results
    % kill job
end
---------------------
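Filled out a bit more, each iteration does roughly this (myTaskFcn and its inputs are placeholders for my actual script):
---------------------
job = createJob(sched);
for t = 1:6
    createTask(job, @myTaskFcn, 1, {a, t});   % one task per worker
end
submit(job);
waitForState(job, 'finished');
results = getAllOutputArguments(job);          % collect results
destroy(job);                                  % kill job
---------------------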
I have a 4-core machine with multi-threading and 12 GB of memory. Every time I run this script it eats more and more memory until it crashes. Then I can't free the memory unless I restart the computer (even quitting MATLAB doesn't do it).
The script runs fine outside the loop on a single core; never any issue that way. I see others have recently run into something similar. Is there a known issue with the Parallel Computing Toolbox? Am I doing something wrong?
Thanks!
Chris

Answers (1)

Jason Ross on 23 Aug 2011

0 votes

When you kill the job, are you using destroy?
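Just clearing or cancelling the job object leaves its data behind; the cleanup at the bottom of each loop iteration should look something like this:
---------------------
destroy(job);   % removes the job and its data from the scheduler
clear job;      % drop the now-invalid handle from the workspace
---------------------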

2 Comments

Chris DeVries on 23 Aug 2011
Yes, I say:
destroy(job)
then the loop starts over at
job = createJob(sched)
The problem might be with a system command that I am calling in the script somewhere. I have to test that... but I think it's in the Parallel Computing Toolbox.
Chris
Jason Ross on 23 Aug 2011
What's actually eating memory? If you look at Task Manager you'll see a few different MATLABs running. Do they continue to grow?
What version are you running?
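One quick way to narrow it down (this assumes Windows, since the memory function is only available there): log the client session's footprint each pass and see whether it keeps climbing even after destroy:
---------------------
for a = 1:nLoops
    % ... createJob / submit / collect / destroy as above ...
    m = memory;                                          % Windows-only
    fprintf('iter %d: client using %.0f MB\n', a, m.MemUsedMATLAB/2^20);
end
---------------------
If the client stays flat, the growth is in the worker MATLAB processes or elsewhere in the OS, which Task Manager would show.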




