Is MapReduce suitable for analyzing a large data set with an iterative function?
Currently, I have a large timetable of more than 1 billion rows x 3 columns.
Some of the key functions I use include:
unstack: turns my timetable into 1 billion rows x 1000 columns.
fillmissing(data, 'previous'): fills each NaN with the previous non-missing value.
retime: in some cases, can increase my number of rows tenfold.
cumsum: accumulates all the previous data (cumulative sum).
I am able to process small datasets using standard MATLAB functions, but for some of the larger datasets (> 1 billion rows) I run into memory issues.
I am planning to break my timetable into smaller pieces, record all the "states" at the end of each section, and repopulate them at the beginning of the next batch.
Can map reduce help me in this situation?
Any pseudocode is appreciated. Thank you.
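The "record states at the end of each section" plan is straightforward for cumsum: carry the running total from one chunk into the next. A minimal sketch, assuming chunks have already been read into memory one at a time (the small cell array here is just a stand-in for real chunk reads):

```matlab
% Hypothetical sketch of the chunked-state idea for cumsum: keep a running
% total between chunks so the cumulative sum is continuous across batches.
carry = 0;                          % state recorded at the end of each section
chunks = {[1 2 3]', [4 5]'};        % stand-in for reading real data chunks
results = cell(size(chunks));
for k = 1:numel(chunks)
    c = cumsum(chunks{k}) + carry;  % repopulate the state at batch start
    carry = c(end);                 % record the state for the next batch
    results{k} = c;
end
% vertcat(results{:}) equals cumsum([1 2 3 4 5]')
```

The same carry-forward pattern works for fillmissing(..., 'previous'): keep the last row of each chunk as the state and prepend it to the next chunk.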
4 comments
Ive J
on 10 Sep 2021
Regardless, in general you can use mapreduce, for example:
ds = datastore(...);                          % datastore over the raw files
outds = mapreduce(ds, @myMapper, @myReducer); % returns a datastore of results
raw = readall(outds);                         % may not fit into memory (maybe tall?)

function myMapper(data, info, intermKVStore)
    % info is the second mapper input; its Offset field identifies the chunk
    data = fillmissing(data, 'previous');     % fill NaNs within this chunk
    % other filters go here
    add(intermKVStore, info.Offset, data);    % key each chunk by its offset
end

function myReducer(intermKey, intermValsIter, outKVStore)
    data = [];
    while hasnext(intermValsIter)
        data = [data; getnext(intermValsIter)]; %#ok<AGROW>
    end
    add(outKVStore, intermKey, data);
end
However, you should be careful with fillmissing (and maybe unstack too) in cases where the first rows of a chunk are missing, because you don't have access to the previous rows; they live in another chunk. So this approach is good only if chunks can be treated independently of each other.
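One way around that boundary problem, sketched below under the assumption that chunks are read in order: prepend the last row of the previous chunk before calling fillmissing, then drop it again. The variable names are illustrative, not part of any API.

```matlab
% Hypothetical sketch: carry the last row of each chunk forward so that
% fillmissing(..., 'previous') has a valid "previous" value at chunk
% boundaries. 'chunks' stands in for sequential reads from a datastore.
chunks = {[1; NaN; 3], [NaN; 5]};   % second chunk starts with a NaN
lastRow = [];                       % state carried between chunks
results = cell(size(chunks));
for k = 1:numel(chunks)
    c = chunks{k};
    if ~isempty(lastRow)
        c = [lastRow; c];           % prepend the boundary state
    end
    c = fillmissing(c, 'previous'); % boundary NaN now fills from lastRow
    if ~isempty(lastRow)
        c(1, :) = [];               % drop the helper row again
    end
    lastRow = c(end, :);            % record the state for the next chunk
    results{k} = c;
end
% vertcat(results{:}) gives [1; 1; 3; 3; 5]
```

Note that this requires processing chunks sequentially, which is exactly what mapreduce does not guarantee, so it fits the questioner's manual-batching plan better than the mapper above.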
Answers (0)