How to avoid memory problems while processing a huge table?

Nitinkumar Ambekar on 31 Aug 2016
I have a huge observation table with around 3 million (30 lakh) rows and 12 columns. While training a kNN classifier in R2016a, I am getting memory-related errors. Is there any way to avoid this? I have tried reducing the number of rows, but it affects the output quality.
Each row in the table is a pixel, with its feature values in the columns. In one set of MRI scans there are around 20 images of 512x512, and I am loading one set to create the observation table. Is there another way to pass a large amount of data to the kNN classifier?
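For reference, a minimal sketch of how such an observation matrix might be assembled in single precision, which halves the memory footprint versus the default double; the file-naming scheme, variable names, and the `fitcknn` call are illustrative assumptions, not the original code:

```matlab
% Assemble one MRI set (20 slices of 512x512, per the question) into an
% observation matrix, preallocated once in single precision.
nSlices = 20; h = 512; w = 512; nFeatures = 12;
X = zeros(nSlices*h*w, nFeatures, 'single');
for k = 1:nSlices
    % 'slice_%02d.png' is a hypothetical file-naming scheme
    img = single(imread(sprintf('slice_%02d.png', k)));
    rows = (k-1)*h*w + (1:h*w);
    X(rows, 1) = img(:);   % feature 1: raw intensity
    % ... fill the remaining feature columns similarly ...
end
% y = ...;              % one class label per pixel/row
% mdl = fitcknn(X, y);  % fitcknn accepts numeric (incl. single) predictors
```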

Answers (1)

KSSV on 31 Aug 2016
See doc datastore, doc memmapfile, and doc mapreduce.
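To illustrate the first suggestion: a datastore streams the data from disk in chunks instead of holding all rows in RAM at once. A minimal sketch, assuming the observation table has been exported to a hypothetical observations.csv:

```matlab
% Stream the observation table from disk in fixed-size chunks
ds = tabularTextDatastore('observations.csv');  % hypothetical CSV export
ds.ReadSize = 100000;                           % rows returned per read
while hasdata(ds)
    t = read(ds);   % each chunk arrives as an ordinary table
    % process the chunk here, e.g. accumulate statistics or predict
end
```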
  1 Comment
Nitinkumar Ambekar on 1 Sep 2016
Thanks @Dr. Siva, one small query: can I pass one of these to a function that takes a `table` or `matrix`?
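(For reference: a datastore is not itself a table, but each `read` returns one, as in the sketch above; a `memmapfile` likewise exposes its `Data` field as an ordinary array that can be sliced and passed on. A minimal `memmapfile` sketch, with the file name and layout as assumptions:)

```matlab
% Map a raw binary dump of the feature matrix without loading it whole;
% 'features.bin' and its 3000000x12 single layout are assumptions.
m = memmapfile('features.bin', ...
    'Format', {'single', [3000000 12], 'X'});
chunk = m.Data.X(1:100000, :);   % only the indexed rows are read into RAM
% 'chunk' is a plain single matrix, usable by any matrix-taking function
```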
