
Large data file I/O

Paul on 17 Jan 2012
I am trying to speed up a bottleneck in our code. Currently, one of our FORTRAN modules writes its output to a text file, which is saved as a .m file. That .m file is then loaded into MATLAB. An example would be
function x = my_data(x)
x.time = [LARGE AMOUNT OF DATA]
Now this read operation seems to cause MATLAB to run out of memory. The MATLAB help suggests storing large amounts of data in MAT-files because they are optimized for read/write operations and also compress the data, and says this is better than using low-level file I/O such as fopen. But since our data is being read from a .m file, is it still using that kind of file I/O? My question is: should we take the time to write the data from FORTRAN as a MAT-file instead of a .m file?

Accepted Answer

Walter Roberson on 17 Jan 2012
It appears to me that you would be better off writing a binary file. The speed might not be as good as a .mat file (because of no compression), but Fortran should have no problem writing a binary file whereas bringing in the .mat format can be a nuisance.
I would also suggest that it would be faster if you used
x.time = my_data(); %no argument
and
function times = my_data
times = fread(...);
end
This would avoid having to make a copy of x.
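For concreteness, a minimal sketch of such a reader might look like the following. The file name 'my_data.bin', double precision, little-endian byte order, and record-marker-free (stream) output from the Fortran side are all assumptions; adjust them to match what the Fortran module actually writes.
function times = my_data
% Read the time vector from a flat binary file written by Fortran.
% Assumes 'my_data.bin' holds raw 8-byte little-endian reals with no
% Fortran record markers (e.g. written with access='stream').
fid = fopen('my_data.bin', 'r', 'ieee-le');
if fid == -1
    error('my_data:openFailed', 'Could not open my_data.bin');
end
times = fread(fid, Inf, 'double');  % read the entire file as doubles
fclose(fid);
end
Called as x.time = my_data();, this reads the whole vector in one pass instead of having MATLAB parse it as source text.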
1 Comment
Paul on 17 Jan 2012
Thanks for the suggestion. I will give it a try. You are absolutely right about .mat being a nuisance. I don't want to re-write and re-compile our FORTRAN code to output .mat. However, changing the output to binary may work, as it is a smaller change in the FORTRAN code.


More Answers (0)


