doubts about parfeval memory allocation

Daniel Vieira on 5 Mar 2024
Commented: Edric Ellis on 6 Mar 2024
I have a question about how parfeval handles memory. My code is something like:
ds = datastore('some/path/*.extension');
T = readall(ds); % T is a very large table, but it fits in memory
N = 20e3;
R = cell(N,1);
for n = 1:N
    subT = makeSubset(T,n);          % subT is a manageable subset of T
    R{n} = calculateSomething(subT); % self-explanatory
end
If I just parfor that, it causes an out-of-memory error (I tried). My guess is that it tries to copy T to every worker, and T barely fits in memory once; it definitely doesn't fit 24 times.
Then I thought of something like:
ds = datastore('some/path/*.extension');
T = readall(ds); % T is a very large table, but it fits in memory
N = 20e3;
for n = N:-1:1
    subT = makeSubset(T,n); % subT is a manageable subset of T
    results(n) = parfeval(@(x) calculateSomething(x),1,subT);
end
But I'm unsure how parfeval handles memory. It will build a queue of 20k jobs, each holding a piece of T. How do I know I won't go out of memory or crash? Is memory for a job allocated when it is created, or when it executes? Is there a way to know, from the code, how much memory it is taking (so that I can pause job creation and wait for results instead of letting it crash)?
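If I do end up having to throttle it myself, I'm picturing something along these lines (just a sketch I haven't tested; the cap of 48 in-flight futures is an arbitrary number):

maxInFlight = 48;                     % arbitrary cap, not a recommendation
R = cell(N,1);
futures = parallel.FevalFuture.empty; % futures currently in flight
pending = [];                         % loop index n for each in-flight future
for n = 1:N
    subT = makeSubset(T,n);
    futures(end+1) = parfeval(@calculateSomething,1,subT); %#ok<AGROW>
    pending(end+1) = n;                                    %#ok<AGROW>
    if numel(futures) >= maxInFlight
        [done,value] = fetchNext(futures); % block until one result is ready
        R{pending(done)} = value;
        futures(done) = [];
        pending(done) = [];
    end
end
while ~isempty(futures) % drain whatever is still running
    [done,value] = fetchNext(futures);
    R{pending(done)} = value;
    futures(done) = [];
    pending(done) = [];
end

But I don't know whether that actually bounds memory the way I hope.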
2 Comments
Walter Roberson on 5 Mar 2024
Have you considered using the background pool? That should cut down on memory copying.
Daniel Vieira on 5 Mar 2024
I'm not sure I understand the suggestion; I thought parfeval already used the background pool... How does it cut down on the copying? If I run the code like that, I expect to see 20k jobs each taking a chunk of memory until they are all done. Is that not what's going to happen?
Or are you suggesting something else entirely?


Accepted Answer

Edric Ellis on 6 Mar 2024
If the entire dataset only just fits in memory once, then a better option than readall might be to use partition on the datastore to divide up the work that way. This example shows how to process data from a datastore in parallel by having each worker read only the portions it needs into memory.
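A minimal sketch of that pattern (assuming your grouping can be expressed per-partition rather than via makeSubset, and that each partition fits comfortably in a worker's memory):

ds = datastore('some/path/*.extension');
pool = gcp;
n = numpartitions(ds, pool); % a partition count suited to the pool
results = cell(n,1);
parfor p = 1:n
    subds = partition(ds, n, p); % each worker reads only its own files
    subT = readall(subds);
    results{p} = calculateSomething(subT);
end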
(In your parfeval case, the bare minimum memory you need is at least twice the size of T, since the Future objects returned by parfeval store a copy of the subset of T that they're going to operate on. By dividing things up into very small pieces as in your example, peak memory usage should not be much more than that: each running worker only needs the inputs for one parfeval request in memory at any time. If you use a thread-based parallel pool, this can be more memory-efficient than a process-based pool.)
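For example, something along these lines (a sketch only; how much copying the thread pool avoids, and whether calculateSomething is supported on thread workers, depends on what it does):

pool = parpool("Threads");
for n = N:-1:1
    subT = makeSubset(T,n);
    results(n) = parfeval(pool, @calculateSomething, 1, subT);
end
R = fetchOutputs(results, "UniformOutput", false); % cell array of results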
2 Comments
Daniel Vieira on 6 Mar 2024
Thanks... I can't really escape readall; there's a bit of preprocessing on T just after reading it (omitted in the code example above) that requires the entire table.
I tried using a background pool last night, and it ran slower than a normal for loop without parallelizing. Memory usage was near 100% plus a chunk of cache, so it was probably just as you said: it needs 2x T in memory. I'll try a thread-based pool tonight; maybe it works better.
Edric Ellis on 6 Mar 2024
backgroundPool is a thread-based pool. It's essentially the same sort of thing as parpool("Threads"). So, if you had problems there, things won't change with parpool("Threads").
Another completely different option, depending heavily on the operations you need to perform on your table, is to use tall arrays. These support both thread and process parallelism, and can help when your data is too large to fit in memory because they operate in a piece-wise manner (but can still perform "whole table" operations).
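Very roughly (the variable name SomeNumericVariable is just a placeholder; whether this maps onto your calculateSomething depends entirely on what it does):

ds = datastore('some/path/*.extension');
tt = tall(ds);                    % lazily evaluated tall table
m = mean(tt.SomeNumericVariable); % deferred "whole table" operation
result = gather(m);               % evaluates in memory-friendly chunks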


More Answers (0)
