Is memory reduction not possible when training a neural network on a GPU?

4 views (last 30 days)
PetterS
PetterS on 3 Aug 2015
Commented: PetterS on 5 Aug 2015
When dealing with large datasets, the amount of temporary storage needed during neural network training can be reduced by adding the name-value pair 'reduction',N to the train command.
This is a neat feature that would be particularly useful when training networks on graphics cards, which generally have less memory than system RAM. However, when I add the option to a training run on my GPU, the reduction seems to have no effect at all. It produces no warning or error message; it simply does not reduce the memory usage of my GPU.
Is this feature not available with GPU training, or do I need to do something differently to make it work?
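For reference, this is roughly the kind of call I am using (the network size and the random data below are just placeholders, not my real setup):

net = feedforwardnet(20);       % placeholder network
x = rand(10, 50000);            % placeholder inputs
t = rand(1, 50000);             % placeholder targets
% Request memory reduction while training on the GPU:
net = train(net, x, t, 'useGPU','yes', 'reduction',10);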
Thanks.

Answers (1)

Nick Hobbs
Nick Hobbs on 5 Aug 2015
The documentation for the train function says that 'reduction' can help with memory only when train performs its calculations in MATLAB code. If train is instead calling a MEX file, the reduction will likely not make much difference. You can determine which resource is being used with 'showResources'. More information on 'showResources' can be found here.
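As a quick check (the network and data names here are just stand-ins for your own), you can pass 'showResources' along with your other training options:

% Report which computing resources train actually uses (MATLAB, MEX, or GPU):
[net, tr] = train(net, x, t, 'useGPU','yes', 'showResources','yes');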
1 Comment
PetterS
PetterS on 5 Aug 2015
Yes, I’ve seen that article. But when I do the training on my GPU the resource isn’t reported as MATLAB or MEX; it is simply reported as “CUDA”. I don’t know whether CUDA calculations count as MATLAB, as MEX, or as neither.
