Out of Memory error when resizing 3D volumes

4 views (last 30 days)
Stephen Macknik
Stephen Macknik on 9 Dec 2022
Edited: Stephen Macknik on 21 Dec 2022
I have a lung CT volume of size 512 X 512 X 264 with a pixel spacing of 0.625 mm and a slice thickness of 1.25 mm. Each slice was displayed at a size of 2080 X 2080 on a screen, and a radiologist's eyes were tracked while they scrolled through the slices in the axial view in search of a nodule. The eye positions were mapped onto a 3D array of size 2080 X 2080 X 264.
Resampled to isotropic 0.625 mm voxels, the original CT volume has a resolution of 512 X 512 X (264*1.25/0.625) = 512 X 512 X 528. Applying the display scaling factor of 2080/512 = 4.0625 along all three axes, the displayed volume has a resolution of 2080 X 2080 X (528*4.0625) = 2080 X 2080 X 2145. When I try resizing the eye-position map of size 2080 X 2080 X 264 to 2080 X 2080 X 2145 using imresize3, I run out of memory. How should I go about fixing this?
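For concreteness, the failing call looks roughly like this (the variable name eyeMap is illustrative):
% Sketch of the failing resize, assuming the eye-position map is a
% 2080 x 2080 x 264 double array.
targetZ = round(264 * (1.25 / 0.625) * (2080 / 512));   % = 2145
resized = imresize3(eyeMap, [2080 2080 targetZ]);       % out of memory here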

Accepted Answer

John D'Errico
John D'Errico on 9 Dec 2022
Edited: John D'Errico on 9 Dec 2022
Simple. If you are running out of memory, then just get more memory.
An image of that size
2080*2080*2145
ans = 9.2801e+09
has almost 10 billion voxels. If it is a 3-channel color image, that is 30 billion elements; as uint8, you will need 30 GB of memory to perform this operation, and I would always figure on needing at least twice the memory of the largest array you are working with. But if the elements are scaled 0-1, they will be doubles, so you would need 8 times as much memory.
Sorry. Memory is cheap. You need to get more of it if you want to do this operation and store the entire array in memory.
Could you completely change your code? Well, yes. You display one slice of the image, then swap to display the next slice. Could you just compute the indicated slice as needed, on the fly? Of course. Will that slow down the time between slices displayed? Possibly.
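A sketch of that on-the-fly idea (variable names and the linear interpolation along z are my illustration, not necessarily what imresize3 does internally):
% Compute displayed slice k (1..2145) directly from the 2080 x 2080 x 264
% eye-position map E, without building the full 2080 x 2080 x 2145 array.
zq = 1 + (k - 1) * (264 - 1) / (2145 - 1);   % map output slice to input z
z0 = floor(zq);
z1 = min(z0 + 1, 264);
w  = zq - z0;                                % linear weight between slices
displayedSlice = (1 - w) * double(E(:, :, z0)) + w * double(E(:, :, z1));
Only two 2080 x 2080 slices are in memory at any time, so the per-slice cost is small.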
So it looks like you have two main choices. Upgrade your computer memory, or rewrite your code.
Computers are still not infinitely large or infinitely fast (except in the movies and on TV). What works for a small problem often fails on a larger problem.
  1 Comment
Stephen Macknik
Stephen Macknik on 21 Dec 2022
Edited: Stephen Macknik on 21 Dec 2022
@John D'Errico @Jan Thank you so much for the suggestions! Converting to uint8 does solve the memory issue. However, I plan to convolve the 2080 X 2080 X 2145 array of fixation positions with a Gaussian filter of size 267 X 267 X 267 using imgaussfilt3 to produce a fixation heat map. This requires the fixation map to be double, or at least single, and convolving the Gaussian filter with the fixation map in either of these datatypes still results in an out-of-memory error.
I did increase the RAM on my system from 64 GB to 128 GB, but only 64 GB of the 128 remains usable, which looks like a system issue. I'm guessing seasoned coders who perform transformations on datasets this large run into similar issues? I tried a few standard fixes: (1) unchecking the maximum-memory option in the System Configuration settings, (2) turning off virtual memory through Advanced System Settings, and (3) updating the BIOS. None of these has freed up all the RAM. Any suggestions?
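One pattern I am considering is filtering in overlapping slabs along z, since the Gaussian only needs local context. A sketch only: sigma = 66.5 is assumed here because imgaussfilt3's default filter size of 2*ceil(2*sigma)+1 then comes out to 267, and the slab size is arbitrary:
% Slab-wise 3D Gaussian filtering of a large single-precision volume V
% (2080 x 2080 x 2145). Sketch: sigma and slab size are assumptions.
sigma = 66.5;            % default FilterSize = 2*ceil(2*sigma)+1 = 267
halfW = ceil(2*sigma);   % 133 slices of context needed on each side
slab  = 256;             % output slices computed per pass
nz    = size(V, 3);
H = zeros(size(V), 'single');                % fixation heat map
for z0 = 1:slab:nz
    z1 = min(z0 + slab - 1, nz);
    lo = max(z0 - halfW, 1);                 % padded input range
    hi = min(z1 + halfW, nz);
    F  = imgaussfilt3(V(:, :, lo:hi), sigma);
    H(:, :, z0:z1) = F(:, :, (z0-lo+1):(z1-lo+1));
end
Interior slabs get the full 133 slices of context, so the result should match filtering the whole volume at once. The catch is that V and H together still take about 74 GB in single precision, so until the full 128 GB is usable it may be necessary to keep H on disk, e.g. writing each slab into a v7.3 MAT-file via matfile instead of holding it in RAM.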


More Answers (1)

Jan
Jan on 9 Dec 2022
If the resized volume is stored as uint8, you need 2080 * 2080 * 2145 bytes = 9.28 GB of free RAM. For a volume in double format you already need 74.2 GB.
The direct solution is to install more RAM.
