How to read extremely large HDF5 data and resolve out-of-memory issues?
14 views (last 30 days)
I need to read several datasets from a 2 TB HDF5 file that are needed for further computation.
If I simply code it as follows,
variable1 = h5read('path to .h5 file', 'path to dataset')
it would require ~500 GB of array memory.
Is there a good way to solve this problem?
Thanks!
Answers (1)
ROSEMARIE MURRAY
on 3 May 2022
You could use a fileDatastore with h5read as the read function, which would let you specify how much to read at a time; see the sketch below.
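For example, here is a minimal sketch of that approach, assuming a dataset at '/data' that can be sliced along its last dimension. The dataset path, chunk size, and readChunk helper are hypothetical placeholders, and the 'partialfile' read mode requires a newer MATLAB release:

dsPath    = '/data';        % hypothetical dataset path -- use your own
chunkSize = 10000;          % slices along the last dimension per read

fds = fileDatastore('path to .h5 file', ...
    'ReadFcn', @(f, u) readChunk(f, u, dsPath, chunkSize), ...
    'ReadMode', 'partialfile');

while hasdata(fds)
    chunk = read(fds);      % returns one slice, not the whole 2 TB
    % ... run your computation on chunk here ...
end

function [data, userdata, done] = readChunk(filename, userdata, dsPath, chunkSize)
% Partial-file read function: returns one chunk per call and remembers
% its position along the last dimension in userdata.
info = h5info(filename, dsPath);   % metadata only, cheap even for 2 TB
dims = info.Dataspace.Size;        % full extent of the dataset
if isempty(userdata)
    userdata = 1;                  % first call: start at the beginning
end
start      = ones(1, numel(dims));
start(end) = userdata;
count      = dims;
count(end) = min(chunkSize, dims(end) - userdata + 1);
data     = h5read(filename, dsPath, start, count);  % read only this slice
userdata = userdata + count(end);
done     = userdata > dims(end);   % true once the dataset is exhausted
end

The same start and count arguments to h5read also work in a plain loop without a datastore, if you prefer to manage the slicing yourself.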
Or you could try this datastore: https://www.mathworks.com/matlabcentral/fileexchange/64919-hdf5-custom-file-datastore-for-timeseries-in-matlab