How to use MATLAB's Neural Network Toolbox for minibatch gradient descent?

Hi,
I want to learn the functional relationship between a set of input-output pairs. Each input is a vector of length 500 and each output is a scalar value. I have 1 million such input-output pairs, and there is not enough disk space to train on this entire batch of data at once (using a GPU).
Is there a way to perform mini-batch training in MATLAB? This question has been asked before ( http://www.mathworks.com/matlabcentral/answers/254826-matlab-neural-network-mini-batch-training ) but received no reply.
I am aware of the function "adapt", which updates the network with each incoming input-output pair, but I want to perform training in mini-batches. Are there any options for doing so with the MATLAB Neural Network Toolbox?
Please help me out, Ekta

Accepted Answer

Greg Heath on 18 Feb 2016
True to his word, Dr. Heath has posted
http://www.mathworks.com/matlabcentral/newsreader/view_thread/344511#943659
Hope this helps
Thank you for formally accepting my answer
Greg
  2 Comments
Ekta Prashnani on 1 Mar 2016
This works, but it is significantly slower on my machine, even when I use a Titan X GPU. I am not sure whether this solution utilizes GPU acceleration.
Greg Heath on 4 Mar 2016
Maybe a good programmer can optimize the code. The logic is straightforward.
Good Luck,
Greg


More Answers (1)

Greg Heath on 17 Feb 2016
There is no problem: train in a loop. However, do not configure or initialize the net between the minibatches of training data.
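This "train in a loop" idea might be sketched as follows. This is a minimal illustration, not Greg's posted code: `loadBatch` is a hypothetical helper that returns one chunk of the data, and the network size, training function, and batch count are placeholder choices.

```matlab
% Sketch: minibatch training by repeatedly calling TRAIN on chunks.
% loadBatch is a HYPOTHETICAL helper returning the k-th chunk as
% inputs Xk (500 x batchSize) and targets Tk (1 x batchSize).

net = fitnet(10, 'trainscg');      % create the network ONCE, before the loop
net.trainParam.epochs = 1;         % a single pass over each chunk
net.divideFcn = 'dividetrain';     % use the whole chunk for training

numBatches = 1000;                 % e.g. 1e6 samples / 1000 per batch
for k = 1:numBatches
    [Xk, Tk] = loadBatch(k);       % hypothetical: load one minibatch
    net = train(net, Xk, Tk);      % weights carry over between calls,
end                                % since we never reconfigure/re-init
```

The key point is the one Greg makes: `configure`/`init` are called (implicitly) only once, so each call to `train` starts from the weights learned on the previous minibatch rather than from a fresh initialization.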
Hope this helps.
Thank you for formally accepting my answer
Greg
  4 Comments
Ekta Prashnani on 17 Feb 2016 (edited 17 Feb 2016)
Thanks for your reply, Greg!
Sorry, I am confused: several online sources seem to suggest that "In training a neural net, the term epoch is used to describe a complete pass through all of the training patterns." (Source: http://www.cse.unsw.edu.au/~billw/mldict.html#epoch )
Some of the experts from Stanford are saying the same thing: "one epoch means that every example has been seen once." (Source: http://cs231n.github.io/neural-networks-3/, Section "Babysitting the learning process" )
Am I missing something in interpreting the text in the above sources? Or is an epoch perhaps defined differently in MATLAB?
I may be wrong, but I think the confusion here is the difference between an epoch and an iteration. An iteration is completed every time the network parameters are updated (whether using the entire training data or mini-batches of it). An epoch is completed when the network has passed through the entire training data once. (Source: http://deeplearning4j.org/glossary.html, please scroll down to the "Epoch vs. Iteration" section)
Please correct me if I am wrong, looking forward to hearing back from you!
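The epoch-vs-iteration distinction above can be made concrete with the numbers from this thread (the batch size is chosen purely for illustration):

```matlab
% With the 1e6 samples from this thread and a hypothetical
% minibatch size of 1000:
numSamples    = 1e6;
batchSize     = 1000;
itersPerEpoch = numSamples / batchSize;   % 1000 weight updates per epoch

% Full-batch training on the same data would instead perform
% exactly 1 iteration (1 weight update) per epoch.
```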
Greg Heath
Greg Heath am 18 Feb. 2016
No, it looks like I was wrong. Searching MathWorks for minibatch info, I get the following numbers of hits:
Search term   NEWSGROUP   ANSWERS
minibatch         0          3
mini-batch        0          6  (includes the above 3)
I will see if I can structure the guts of a naive minibatch code and post it in the NEWSGROUP.
Greg

