How to add batch normalization layers between the convolution and ReLU layers in GoogLeNet? Suggestions to improve accuracy.

How do I add a batch normalization layer to GoogLeNet in MATLAB? The layers look like this:
[Attachment: layers.png]
I want to add a batch normalization layer between each convolution layer and its ReLU layer. This is an image classification task, so do I need to add a batch normalization layer once, or after each convolution layer? For replacing a single layer, this piece of code works. But if it needs to be added after each convolution layer, how do I do that?
% Replace the first ReLU with a batch normalization layer followed by a leaky ReLU
% (note: the scale of leakyReluLayer is a positional argument, not a name-value pair)
larray = [batchNormalizationLayer('Name','BN1')
          leakyReluLayer(0.1,'Name','leakyRelu_1')];
lgraph = replaceLayer(lgraph,'conv1-relu_7x7',larray);
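If a batch normalization layer should follow every convolution layer, one approach is to loop over the layer graph and splice a new layer into each conv-ReLU connection with disconnectLayers/addLayers/connectLayers. This is only a sketch: it assumes each convolution layer in GoogLeNet is immediately followed by its ReLU in the Layers array, which should be verified against your graph before use.

```matlab
lgraph = layerGraph(googlenet);

% Find every 2-D convolution layer and splice a batch normalization
% layer between it and its successor (assumed here to be a ReLU).
convIdx = find(arrayfun(@(l) isa(l,'nnet.cnn.layer.Convolution2DLayer'), lgraph.Layers));
for k = 1:numel(convIdx)
    convName = lgraph.Layers(convIdx(k)).Name;
    reluName = lgraph.Layers(convIdx(k)+1).Name;   % assumption: ReLU follows the conv
    bnName   = ['BN_' convName];
    lgraph = disconnectLayers(lgraph,convName,reluName);
    lgraph = addLayers(lgraph,batchNormalizationLayer('Name',bnName));
    lgraph = connectLayers(lgraph,convName,bnName);
    lgraph = connectLayers(lgraph,bnName,reluName);
end
plot(lgraph)   % inspect the result before retraining
```

After modifying the graph this way, retrain with trainNetwork as before.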
Accuracy is 65%, and I need to improve it. I used the training options below. Apart from adding batch normalization layers, will changing any parameter values increase the accuracy? Expert suggestions welcome.
miniBatchSize = 10;
valFrequency = floor(numel(augimdsTrain.Files)/miniBatchSize);
options = trainingOptions('sgdm', ...
    'MiniBatchSize',miniBatchSize, ...
    'MaxEpochs',7, ...
    'InitialLearnRate',3e-4, ...
    'Shuffle','every-epoch', ...
    'ValidationData',augimdsValidation, ...
    'ValidationFrequency',valFrequency, ...
    'Verbose',false, ...
    'Plots','training-progress');
[Attachment: googlenet.png]

Answers (1)

Sourav Bairagya on 14 Feb 2020
To add new layers to a layerGraph object, first add the new layers using the 'addLayers' function. Then use the 'connectLayers' function to connect them within the layerGraph. The documentation pages for these functions may be helpful.
To improve accuracy, you can try different optimizers, or change the mini-batch size, number of epochs, and learning rate in 'trainingOptions'.
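As an illustration, here is one set of trainingOptions to experiment with. The specific values are untuned starting points, not recommendations; the learning-rate schedule and validation patience are additions beyond the original options.

```matlab
miniBatchSize = 32;                       % larger batches give smoother gradient estimates
options = trainingOptions('sgdm', ...
    'MiniBatchSize',miniBatchSize, ...
    'MaxEpochs',20, ...                   % train longer than 7 epochs
    'InitialLearnRate',1e-3, ...
    'LearnRateSchedule','piecewise', ...  % decay the learning rate during training
    'LearnRateDropFactor',0.1, ...
    'LearnRateDropPeriod',10, ...
    'Shuffle','every-epoch', ...
    'ValidationData',augimdsValidation, ...
    'ValidationPatience',5, ...           % stop early if validation loss stops improving
    'Verbose',false, ...
    'Plots','training-progress');
```

Adjusting one option at a time, while watching the validation curve, makes it easier to see which change actually helps.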
  1 Comment
Karthik K on 17 Feb 2020
I tried this way.
b1=batchNormalizationLayer('Name','BN1');
b2=batchNormalizationLayer;
lgraph = layerGraph;
lgraph = addLayers(lgraph,BN1);
lgraph = addLayers(lgraph,BN2);
lgraph = connectLayers(lgraph,'BN1','add_1/in1');
lgraph = connectLayers(lgraph,'BN2','add_1/in2');
plot(lgraph)
It gives me an error saying: Unrecognized function or variable 'BN1'.
Can you show me here how to add a batch normalization layer between the convolution and ReLU layers at positions 7 & 8 and 9 & 10, so that I get the idea?
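For reference, the error above occurs because 'BN1' is the layer's Name property, while the workspace variables are b1 and b2; addLayers expects the variables. Also note that an empty layerGraph contains no 'add_1' layer to connect to (and GoogLeNet itself uses depth concatenation rather than addition layers). A corrected sketch for splicing batch normalization into two specific conv-ReLU pairs follows; the layer names are taken from MATLAB's pretrained GoogLeNet, but they should be checked against your own graph, since the exact positions (7 & 8, 9 & 10) may differ.

```matlab
lgraph = layerGraph(googlenet);

% Conv/ReLU pairs to split; verify these names in lgraph.Layers first.
pairs = {'conv2-3x3_reduce','conv2-relu_3x3_reduce';
         'conv2-3x3',       'conv2-relu_3x3'};

for k = 1:size(pairs,1)
    convName = pairs{k,1};
    reluName = pairs{k,2};
    bnName   = ['BN_' convName];
    lgraph = disconnectLayers(lgraph,convName,reluName);
    lgraph = addLayers(lgraph,batchNormalizationLayer('Name',bnName));
    lgraph = connectLayers(lgraph,convName,bnName);
    lgraph = connectLayers(lgraph,bnName,reluName);
end
plot(lgraph)
```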

