Implement a matrix with 2 million rows in MATLAB!
Hi,
I have a matrix that grows as my code runs. I know that by the last section, the matrix will have 2 million rows and 9000 columns. But MATLAB cannot handle that, and after some hours it says: "Out of memory. Type HELP MEMORY for your options."
I tried a sparse matrix, and also int32, int16, and int8, but I saw the error again. What can I do? It's very important to me.
The machine has 6 GB of RAM.
Answers (2)
John D'Errico
on 13 Nov 2014
You want to create a huge matrix. It is not sparse, else you would have said that most of the elements are zeros. Sparse won't help you otherwise, and for there to be any gain at all, you want only a small percentage of the values to be non-zero.
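To see how little slack there is, here is a rough sketch of the storage a sparse double matrix of that size needs as a function of density (the 16*nnz + 8*(n+1) estimate assumes 64-bit MATLAB: 8 bytes per value plus 8 bytes per row index per nonzero, plus one 8-byte pointer per column):

% Approximate storage for a 2e6-by-9000 sparse double matrix
% at various densities (64-bit MATLAB assumed).
m = 2e6; n = 9000;
for density = [0.001 0.01 0.1 0.5]
    nz = density * m * n;          % number of nonzeros
    bytes = 16*nz + 8*(n + 1);     % rough sparse storage estimate
    fprintf('density %5.3f -> ~%6.1f GB\n', density, bytes / 1e9);
end

Even at 10% density the sparse form already needs about 29 GB, so sparse only pays off when the matrix is overwhelmingly zero.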
How big is the matrix? This is a computation you should do yourself!
2e6*9000 = 1.8e10
So roughly 18 billion elements. If they are doubles, each element uses 8 bytes, so that matrix will require roughly 144 gigabytes of memory. How can you be surprised that you ran out of memory?
Even as a uint8 array, it will still require 18 gigabytes of RAM. You have 6 GB, of which MATLAB can use only a fraction for array storage.
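For the record, here is that arithmetic as a throwaway MATLAB snippet (plain arithmetic, nothing toolbox-specific):

% Memory footprint of a full 2e6-by-9000 matrix for a few element types.
m = 2e6; n = 9000;
bytesPerElement = struct('double', 8, 'single', 4, 'int16', 2, 'int8', 1);
for cls = fieldnames(bytesPerElement)'
    gb = m * n * bytesPerElement.(cls{1}) / 1e9;
    fprintf('%-7s -> %6.1f GB\n', cls{1}, gb);
end

which prints 144 GB for double down to 18 GB for int8, all far beyond 6 GB of RAM.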
You don't say which version of MATLAB you are using. Is this a 64 bit version?
People think their computers are infinitely large and infinitely fast. Computers have a finite amount of RAM. Sorry, but they do. If you want to work with huge arrays, then get more memory. Use the 64 bit version of MATLAB for your computer, so it can address all that memory. Note that even if you have 64 bit MATLAB, trying to work with that huge of an array will be incredibly slow UNLESS you have sufficient RAM. Big problems require big computers.
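If you are not sure what your installation can address, something like this reports it (a sketch; note that the memory function exists on Windows only):

% Which architecture is this MATLAB, and how much can it allocate?
disp(computer('arch'))              % e.g. 'win64' or 'glnxa64' for 64-bit
if ispc
    userview = memory;              % MEMORY is available on Windows only
    fprintf('Largest possible array: %.1f GB\n', ...
            userview.MaxPossibleArrayBytes / 1e9);
end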
6 comments
Oleg Komarov
on 17 Nov 2014
MapReduce trades off memory against I/O operations on disk. This means that the datastore points to a collection of files on your hard drive. MapReduce pulls in small blocks that fit in memory, applies the mapping and a first consolidation/reduction, and then finally reduces the intermediate results to the final statistic. So I recommend reading the docs more carefully. There is no way around it with big data: it is tedious and not simple to grasp at first. I cannot simply write an example, because it would not be simpler than what the docs already show. So I am not going to reinvent the wheel.
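Still, for orientation, here is a minimal, hypothetical sketch of the pattern Oleg describes, assuming the data is spread across CSV files matching bigdata_*.csv with one numeric column of interest named x (both names invented for illustration):

% Compute the overall mean of column 'x' across files that never fit
% in memory at once (R2014b mapreduce; save the two functions on the path).
ds = datastore('bigdata_*.csv', 'SelectedVariableNames', 'x');
outds = mapreduce(ds, @meanMapper, @meanReducer);
readall(outds)                      % table with the final key/value pair

function meanMapper(data, ~, intermKVStore)
% Called once per in-memory block; emit a partial [sum, count].
add(intermKVStore, 'partial', [sum(data.x), numel(data.x)]);
end

function meanReducer(~, intermValIter, outKVStore)
% Combine all partial sums/counts into one mean.
total = 0; count = 0;
while hasnext(intermValIter)
    v = getnext(intermValIter);
    total = total + v(1);
    count = count + v(2);
end
add(outKVStore, 'meanOfX', total / count);
end

The point is exactly what Oleg says: only a small block of the data is in memory at any time.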
matt dash
on 17 Nov 2014
Take a deep breath and think about whether you REALLY need that matrix. At 144 GB, basically NO ONE's computer could hold this data and still have enough memory left to do something useful with it. (Figure 256 GB is about the top of the top end you're likely to come across in nice computers today.) And yet we've gotten pretty far as a society, so somehow we're all getting by with smaller matrices. Instead of taking for granted that you NEED this matrix, try explaining the problem you're trying to solve, and maybe people can suggest ways to accomplish it without such a large matrix.
(Also, you don't want a big matrix to be growing in your code. If you know how big it needs to be at the end, just preallocate it to that size, so you get the out-of-memory error right away. There's no point in waiting several hours just to run out of memory.)
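A sketch of the difference (newRow here is hypothetical):

% Growing row by row: every resize reallocates and copies, and an
% impossible request only fails hours into the run.
%   A = [];
%   A(end+1, :) = newRow;           % repeated 2 million times

% Preallocating: the whole request is made up front, so it fails
% (or succeeds) immediately.
A = zeros(2e6, 9000, 'int8');       % ~18 GB even as int8; errors at once on a 6 GB machine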