Training a convolutional neural network with matconvnet using an hdf5 file.
8 views (last 30 days)
Samuel Spencer
on 25 Oct 2017
Commented: sally
on 14 Jan 2018
Hi,
I have a large dataset (~1 GB, hopefully expanding to ~100 GB) stored hierarchically in an HDF5 file which I'd like to use with neural networks, specifically the MatConvNet package with MexConv3D (and R2016b). Ideally, I'd like to avoid loading the entire file into memory, so is there a way of achieving something similar to the 'matfile' command with HDF5? Alternatively, is there a way to do something along the lines of (in pseudocode):
for k = 1:numImagesInFile
    image = h5read(file, '/Path/to/dataentry'); % load in one image ('h5read' reads HDF5; 'hdfread' is for HDF4)
    net = train_neural_net(net, image);         % do the bit of training needed on just this image
    clear image                                 % wipe the image from memory, keeping just the net
end
Many thanks in advance
2 Comments
per isakson
on 26 Oct 2017
sally
on 14 Jan 2018
An unrelated comment to the answer, but a question please: have you used the MexConv3D package as a CNN for 3D input images? If so, how do you train this algorithm on a large dataset of 3D images? How do you update the weights?
Accepted Answer
Amy
on 27 Oct 2017
Hi Samuel,
There is not a way to access the data in an HDF5 file without loading it into memory, but there are examples of reading in subsets of a dataset using 'h5read' in its documentation.
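For example, a minimal sketch of reading one image at a time with 'h5read' and 'h5info' (the file name 'myData.h5', the dataset path '/images', and the 64x64 image size are placeholders, not part of the original question):

```matlab
% Read one image at a time from an HDF5 dataset, never loading the whole set.
% 'myData.h5', '/images', and the 64x64 image size are hypothetical.
info = h5info('myData.h5', '/images');    % dataset assumed to be H x W x N
nImages = info.Dataspace.Size(end);
for k = 1:nImages
    % start = [1 1 k], count = [64 64 1]: read only the k-th slice
    img = h5read('myData.h5', '/images', [1 1 k], [64 64 1]);
    % ... train on img here; it is overwritten on the next iteration ...
end
```

The start/count arguments to 'h5read' are what keep the memory footprint to a single image per iteration.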
It is definitely possible to train your network in a loop as you describe, loading and unloading the training data on each loop iteration. There are a few tricks you have to use to get this to work:
- You have to make sure that the network is configured to use only the input data as training data.
- You have to use the training algorithm 'traingd'.
Note that training this way is very slow (switching off the GUI might help speed it up). You may not want to train on each individual image, but rather on small batches of images.
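As a hedged sketch of such a batch loop (the network layout, the 'loadBatchFromHDF5' helper, and 'numBatches' are assumptions for illustration, not part of the answer above); repeated calls to 'train' continue from the network's current weights:

```matlab
net = feedforwardnet(10);            % example network; the layout is an assumption
net.trainFcn = 'traingd';            % gradient descent, as suggested above
net.trainParam.showWindow = false;   % switch off the training GUI for speed
net.trainParam.epochs = 1;           % one pass over each batch per call
for b = 1:numBatches
    [Xb, Tb] = loadBatchFromHDF5(b); % hypothetical helper built on 'h5read'
    net = train(net, Xb, Tb);        % 'train' starts from the current weights
    clear Xb Tb                      % free the batch; only the net persists
end
```

Setting 'epochs' to 1 keeps each call to 'train' to a single pass, so the outer loop controls how many times the full dataset is visited.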
More Answers (0)