Best way to deal with large data for deep learning?
Hi, I have been trying image classification with CNNs. I have about 350,000 images, which I read and stored in a 4-D matrix of size 170 x 170 x 3 x 350,000 in a data.mat file. I used matfile to keep appending new images to data.mat. The resulting file is almost 20 GB.
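A minimal sketch of that append workflow, assuming the source images are JPEG files in one folder; the folder name and the variable name images are placeholders, and matfile only supports this kind of partial write for MAT-files saved in -v7.3 format:

files = dir(fullfile('imageFolder', '*.jpg'));      % hypothetical folder and file pattern
m = matfile('data.mat', 'Writable', true);
for k = 1:numel(files)
    img = imread(fullfile('imageFolder', files(k).name));
    img = imresize(img, [170 170]);                 % each image is 170 x 170 x 3
    m.images(1:170, 1:170, 1:3, k) = img;           % grows the 4-D array on disk
end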
The problem now is that I cannot access the saved images because I run out of memory.
Does anyone have suggestions for a more efficient way to build large datasets for deep learning?
One solution would be to split the data and train two networks, the second with its weights initialized from the first's final weights, but I don't want to take that route!
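For reference, matfile does support partial reads from a -v7.3 MAT-file, so one option is to pull out a mini-batch at a time instead of loading the whole 20 GB array; a minimal sketch, where the variable name images and the batch size are assumptions:

m = matfile('data.mat');
nImages   = size(m, 'images', 4);           % query the array size without loading the data
batchSize = 128;                            % assumed mini-batch size
for start = 1:batchSize:nImages
    stop  = min(start + batchSize - 1, nImages);
    batch = m.images(:, :, :, start:stop);  % loads only this slice into RAM
    % ... train on this batch ...
end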
2 Comments
KSSV
on 22 Jun 2016
Do you want to process the whole data set (170 x 170 x 3 x 350,000) at once, or do you use only one image (170 x 170 x 3) at each step?