How to solve an "out of memory" error?

1 view (last 30 days)
Minu on 7 May 2013
I am doing my project in OCR. I use an image size of 64x64 because when I tried 32x32 and other smaller sizes, some pixels were lost. I have tried features such as zonal density, Zernike moments, projection histogram, distance profile, and crossings. The main problem is that the feature vector is too big. I have tried combinations of the above features, but whenever I train the neural network I get an "out of memory" error. I tried PCA for dimensionality reduction, but it did not work well: I did not get good efficiency during training. I ran the code on both my PC and my laptop and got the same error on both; my RAM is 2 GB. So I am thinking about reducing the image size. Is there any other solution to this problem?
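For reference, a minimal sketch of the kind of PCA reduction meant here (this assumes the Statistics Toolbox; the variable X and the 99% variance threshold are illustrative, not taken from my actual code):

% X is a features-by-samples matrix; pca() expects samples-by-features,
% hence the transpose.
[coeff, score, ~, ~, explained] = pca(X');
k = find(cumsum(explained) >= 99, 1);  % smallest k covering 99% of variance
Xreduced = score(:, 1:k)';             % k-by-samples reduced feature matrix
% coeff(:, 1:k) projects new, mean-centered samples into the same space.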
I have one more problem: whenever I train the neural network using the same features, the result varies. How can I solve this as well?

Accepted Answer

Greg Heath on 7 May 2013
Of course pixels are lost when you reduce the size. I am not an expert in imagery, therefore I cannot confidently suggest another method. However, there must be several acceptable ones available. Why don't you submit a post on image feature reduction?
The NNs in the NNTBX randomly divide data and randomly initialize net weights. If you want to reproduce a design, set the RNG to the same initial state as before.
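For example (a minimal sketch; the seed 0 and the hidden-layer size 20 are arbitrary choices, and x and t stand for your feature and target matrices):

rng(0);                       % fix the RNG state before creating/training
net = patternnet(20);
[net, tr] = train(net, x, t); % data division and weight init now repeat
% Running these lines again from the same rng(0) call reproduces the
% identical trained network.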
Hope this helps.
Thank you for formally accepting my answer
Greg

More Answers (1)

Jan on 7 May 2013
What about installing more RAM?

