Parallel Job causing memory leak?

Chris DeVries on 22 Aug 2011
I have converted a script into a parallel job with a pretty simple outline. Roughly this:
---------------------
sched = findResource('scheduler', 'type', 'local');
set(sched, 'ClusterSize', 6);
for a = 1:nLoops
    job = createJob(sched);
    % create 6 tasks, start scheduler with script, etc.
    % create sub-directories for each
    % collect results
    % kill job
end
-----------------
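For reference, a fleshed-out version of the loop above using the job API from that era of the Parallel Computing Toolbox might look like the sketch below. This is only an illustration of the pattern, not the original script: myTaskFcn and its arguments are placeholders for whatever the real tasks run.

```
sched = findResource('scheduler', 'type', 'local');
set(sched, 'ClusterSize', 6);
for a = 1:nLoops
    job = createJob(sched);
    for t = 1:6
        % each task runs one instance of the worker function
        createTask(job, @myTaskFcn, 1, {a, t});
    end
    submit(job);
    waitForState(job, 'finished');
    results = getAllOutputArguments(job);  % collect results
    destroy(job);                          % kill job and remove its data
end
```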
I have a 4-core machine with multi-threading and 12 GB of memory. Every time I run this script it eats more and more memory until it crashes. Then I can't free the memory unless I restart the computer (even quitting MATLAB doesn't do it).
The script runs fine outside the loop on a single core. Never any issue that way. I see others have recently run into something similar. Is there a known issue with the Parallel Computing Toolbox? Am I doing something wrong?
Thanks!
Chris

Answers (1)

Jason Ross on 23 Aug 2011
When you kill the job, are you using destroy?
2 comments
Chris DeVries on 23 Aug 2011
Yes, I say:
destroy(job)
then the loop starts over at
job = createJob(sched)
The problem might be with a system command that I am calling in the script somewhere. I have to test that... but I think it's in the Parallel Computing Toolbox.
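One thing worth checking in a loop like this is that destroy still runs when something inside the loop body errors out; otherwise the job (and its workers' data) can linger. A sketch of guarding the cleanup, assuming the same job variable as above (the try/catch structure is a suggestion, not from the original post):

```
job = createJob(sched);
try
    % ... create tasks, submit, wait, collect results ...
catch err
    destroy(job);   % make sure the job is cleaned up even on error
    rethrow(err);
end
destroy(job);       % normal-path cleanup
```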
Chris
Jason Ross on 23 Aug 2011
What's actually eating memory? If you look at Task Manager you'll see a few different MATLABs running. Do they continue to grow?
What version are you running?


