Why doesn't mini-batch size make any difference in the training speed of my neural network?

I created a neural network just to run some benchmarks, and the training time does not seem to be affected by the MiniBatchSize option.
I tried the code below with batch=32 and batch=1000.
I have 1491 training sequences, each of length 10.
Training takes about 60 s (for 500 epochs) in both cases.
I tried the same architecture in Python using Keras with TensorFlow, and there the results changed significantly with batch size: 17 s for batch=1000 versus 140 s for batch=32.
In MATLAB I also get the same training time regardless of the solver ('sgdm' / 'rmsprop' / 'adam').
Why is this happening? Am I doing something wrong?
inputSize = 10;       % length of each training sequence
lstm_neurons = 100;
maxEpochs = 500;
batch = 32;           % also tried 1000
learningrate = 0.001; % assumed value; not given in the original post

layers = [ ...
    sequenceInputLayer(inputSize)
    fullyConnectedLayer(inputSize)
    lstmLayer(lstm_neurons,'OutputMode','sequence')
    lstmLayer(lstm_neurons,'OutputMode','sequence')
    fullyConnectedLayer(1)
    regressionLayer];

options = trainingOptions('adam', ...
    'ExecutionEnvironment','gpu', ...
    'MaxEpochs',maxEpochs, ...
    'Verbose',0, ...
    'InitialLearnRate',learningrate, ...
    'MiniBatchSize',batch);

temp_nets = trainNetwork(x_train,y_train,layers,options);
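For context on why batch size is normally expected to change wall-clock time: each epoch performs one gradient update per mini-batch, so with 1491 sequences a batch of 32 means roughly 47 updates per epoch while a batch of 1000 means only 2. A minimal Python sketch of that arithmetic (illustration only, not part of the benchmark code):

```python
import math

def iterations_per_epoch(num_sequences, batch_size):
    """One gradient update per mini-batch, so updates per epoch
    is the sequence count divided by the batch size, rounded up."""
    return math.ceil(num_sequences / batch_size)

N = 1491  # training sequences, as in the question
for batch in (32, 1000):
    print(batch, iterations_per_epoch(N, batch))
# batch=32  -> 47 updates per epoch
# batch=1000 -> 2 updates per epoch
```

That roughly 23x gap in updates per epoch is consistent with the Keras timings above (140 s vs 17 s), which is what makes the flat MATLAB timing surprising.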
1 Comment
Tran Vinh on 9 Dec 2019
Hi J R,
Did you find a solution yet? I'm seeing the same issue. If you or anyone has found a solution, please share.
Thank you


Answers (0)

Version

R2018a
