Modifying large matrices without loading them entirely into memory

Hi,
I'm trying to modify very large matrices (single precision, 50e3 x 50e3) that don't sensibly fit into memory. What data-handling strategy would you recommend? Ideally I could load, say, a 100x100 block, modify it, and write it back. My machine has an SSD connected via M.2, so it should be reasonably fast (though of course nowhere near as fast as RAM). What do you suggest?
Thanks,
Moritz

Answers (2)

Stephen23
Stephen23 on 18 Jun 2015
Edited: Stephen23 on 18 Jun 2015

1 Vote

You should read TMW's own advice on working with big data.
In particular, you might find memmapfile of significant interest:
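To illustrate the idea, here is a minimal sketch of the block-wise read-modify-write workflow the question describes, using memmapfile. The file name `bigmatrix.dat` and the variable label `A` are assumptions; the matrix must first exist on disk as raw binary in the mapped format.

```matlab
% Hypothetical sketch: a 50e3 x 50e3 single matrix stored as raw binary
% in 'bigmatrix.dat' (assumed file name). memmapfile maps the file into
% the address space, so only the blocks you touch are paged in from disk.
n = 50e3;
m = memmapfile('bigmatrix.dat', ...
    'Format', {'single', [n n], 'A'}, ...
    'Writable', true);

% Read, modify, and write back one 100x100 block; only the touched
% pages go through the SSD, never the full 10 GB array.
blk = m.Data.A(1:100, 1:100);
m.Data.A(1:100, 1:100) = 2 * blk;
```

Note that MATLAB stores arrays column-major, so blocks spanning whole columns (e.g. `A(:, 1:100)`) are contiguous on disk and read faster than square tiles.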

1 Comment

Or instead of memmapfile, save the array to a .mat file with -v7.3 and then use matfile objects to read and write portions of it.
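A minimal sketch of that alternative, assuming the matrix was saved once as variable `M` in `big.mat` (both names are assumptions) using `save('big.mat', 'M', '-v7.3')`:

```matlab
% Hypothetical sketch: -v7.3 MAT-files are HDF5-based, so matfile can
% index into a stored array without loading it whole.
mf = matfile('big.mat', 'Writable', true);

blk = mf.M(1:100, 1:100);      % reads only this 100x100 block from disk
mf.M(1:100, 1:100) = blk + 1;  % writes only this block back
```

One caveat: each indexed access on a matfile object has noticeable per-call overhead, so it pays to work in the largest blocks that fit comfortably in RAM rather than many tiny ones.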


Alessandro
Alessandro on 18 Jun 2015

0 Votes

Have you checked out the sparse command?
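For reference, a quick sketch of why sparse storage only helps when most entries are zero: MATLAB stores just the nonzero values plus their row/column indices, so memory scales with the nonzero count, not the matrix size.

```matlab
% Sparse storage keeps only nonzeros (plus index arrays), so memory
% grows with nnz(S), not numel(S). With ~95% nonzero entries, a sparse
% matrix would actually use MORE memory than the full single array.
% Note also that sparse supports only double (and logical), not single.
S = sparse(50e3, 50e3);   % empty 50e3 x 50e3 sparse matrix, tiny footprint
S(1, 1) = 3.14;           % storage grows only as nonzeros are added
whos S                    % inspect bytes used vs. a full array
```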

1 Comment

Yes, I did. However, I believe that only helps if a considerable fraction of the elements are zero. In my case, fewer than 5% of the elements are zero.

