What to do when you really ARE out of memory?
This question is closed. Reopen it to edit or answer it.
How can I optimize my code when the dataset is simply too large for memory?
Currently I need to run triscatteredinterp on three vectors, each 100,000,000 x 1.
scatteredInterpolant fares no better in this case.
Answers (3)
the cyclist
on 4 Aug 2015
Edited: the cyclist
on 4 Aug 2015
2 votes
For very large datasets, processing a random sample of the data will often give satisfactory results.
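A minimal sketch of that idea, assuming the three vectors are `x`, `y`, and `v` as in the question, and that `xq`, `yq` are your query points (both names are assumptions):

```matlab
% Subsample the 100,000,000-point dataset before interpolating.
n = numel(x);
k = 1e6;                      % subsample size; tune for accuracy vs. memory
idx = randperm(n, k);         % k random indices, drawn without replacement

% Build the interpolant from the sample only.
F = scatteredInterpolant(x(idx), y(idx), v(idx));
vq = F(xq, yq);               % evaluate at the query points
```

If the result is too noisy, increase `k` or repeat with several independent samples and compare the outputs to gauge the sampling error.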
Walter Roberson
on 4 Aug 2015
1 vote
Store the data in hierarchies such as octrees that allow you to extract a subset that fits within working memory to do the fine-grained work on.
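A simplified sketch of the same idea using a flat grid of spatial bins rather than a full octree (all names here, `x`, `y`, `v`, `cellSize`, and the query box bounds, are assumptions, not from the original answer):

```matlab
% Bucket points into coarse grid cells, then keep only the points whose
% cells overlap the region of interest, so the working set fits in memory.
cellSize = 10;                         % coarse cell edge length (problem-dependent)
cx = floor(x / cellSize);              % integer cell coordinates per point
cy = floor(y / cellSize);

% Query region [x0 x1] x [y0 y1]; keep points in the overlapping cells.
keep = cx >= floor(x0/cellSize) & cx <= floor(x1/cellSize) & ...
       cy >= floor(y0/cellSize) & cy <= floor(y1/cellSize);

% Do the fine-grained interpolation on the small subset only.
F = scatteredInterpolant(x(keep), y(keep), v(keep));
```

A true octree refines this by subdividing only the densely populated cells, so each leaf holds a bounded number of points regardless of how unevenly the data are distributed.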
Robert Jenkins
on 7 Aug 2015
1 vote