GTX1060 for deep learning semantic image segmentation
Hello!
I am attempting to train SegNet for semantic segmentation, following the example here: https://www.mathworks.com/examples/matlab/community/24778-semantic-segmentation-using-deep-learning
However, I keep running into an out-of-memory error. I am wondering whether the error is in my code or whether my GTX 1060 3GB GPU is simply not powerful enough to train SegNet as in the example. I have already reduced the mini-batch size to 1 (see the sketch below), so I'm unsure what other fixes I can make.
Thanks!
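A minimal sketch of the mini-batch setting described above, assuming the SGDM solver and the trainingOptions call used in the linked SegNet example; only MiniBatchSize = 1 is known from the post, and the other values and variable names are hypothetical placeholders, not the poster's actual code:

    % Hypothetical training options, assuming the SGDM solver from the
    % linked SegNet example. Only MiniBatchSize = 1 is known from the
    % post; the other values are illustrative placeholders.
    options = trainingOptions('sgdm', ...
        'MiniBatchSize', 1, ...        % already reduced to 1 as described
        'InitialLearnRate', 1e-3, ...  % placeholder value
        'MaxEpochs', 100);             % placeholder value

    % Training call as in the example (inputs are illustrative names):
    % net = trainNetwork(trainingData, lgraph, options);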
Answers (1)
Joss Knight on 23 Jan 2018
Edited: Joss Knight on 23 Jan 2018
Yes, sorry, 3GB isn't enough for this example; SegNet is simply too high-resolution a network. You could try training on the CPU instead (see the sketch below). Alternatively, 3GB might be enough if your 1060 were not also driving the display.
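A minimal sketch of the two points above, assuming the standard gpuDevice and trainingOptions APIs from Parallel Computing Toolbox and Deep Learning Toolbox; the surrounding names are illustrative:

    % Check how much of the 3GB is actually free; driving the display
    % reserves part of the GPU's memory.
    d = gpuDevice;
    fprintf('Available GPU memory: %.2f GB\n', d.AvailableMemory/1e9);

    % Workaround: train on the CPU instead. Much slower for SegNet,
    % but not limited by GPU memory.
    options = trainingOptions('sgdm', ...
        'MiniBatchSize', 1, ...
        'ExecutionEnvironment', 'cpu');  % force CPU training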