Folding batch normalization into preceding convolution

Kai Tan on 17 Feb 2019
I am trying to port a trained MATLAB CNN model to another framework. I would like to get rid of the batch normalization (BN) layers by folding their parameters into the preceding convolution layers. I use the following formulation:
% layer is the BatchNorm layer; convLayer is the preceding convolution layer
m      = layer.TrainedMean;
v      = layer.TrainedVariance;
offset = layer.Offset;
scale  = layer.Scale;
ep     = layer.Epsilon;
w = convLayer.Weights;  % 4-D array: [height, width, channels, numFilters]
b = convLayer.Bias;     % 1-by-1-by-numFilters
new_w = zeros(size(w), 'like', w);
new_b = zeros(size(b), 'like', b);
% fold the BN parameters into the convolution, one output channel at a time:
for j = 1:size(scale, 3) % the number of output channels
    denom = sqrt(v(1,1,j) + ep);
    % adjust the 4-D convolution weights
    new_w(:,:,:,j) = scale(1,1,j)*w(:,:,:,j)/denom;
    % adjust the convolution biases
    new_b(1,1,j) = scale(1,1,j)*(b(1,1,j) - m(1,1,j))/denom + offset(1,1,j);
end
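The folding algebra itself can be sanity-checked outside MATLAB. Below is a NumPy sketch of the same per-channel fold; the `fold_bn` helper name and the array layout are assumptions for illustration, not part of the original post:

```python
import numpy as np

# Conv weights laid out (kh, kw, cin, cout), biases (cout,) -- an assumed
# layout, analogous to MATLAB's [h, w, channels, numFilters].
def fold_bn(w, b, mean, var, offset, scale, eps):
    """Fold BN (y = scale*(x-mean)/sqrt(var+eps) + offset) into conv params."""
    denom = np.sqrt(var + eps)                   # per-channel, shape (cout,)
    new_w = w * (scale / denom)                  # broadcasts over last axis
    new_b = scale * (b - mean) / denom + offset
    return new_w, new_b

rng = np.random.default_rng(0)
cout = 4
w = rng.standard_normal((3, 3, 2, cout))
b = rng.standard_normal(cout)
mean = rng.standard_normal(cout)
var = rng.random(cout) + 0.1
offset = rng.standard_normal(cout)
scale = rng.standard_normal(cout)
eps = 1e-5

# For a single input patch x, conv followed by BN must equal the folded conv:
x = rng.standard_normal((3, 3, 2))
conv = np.tensordot(x, w, axes=([0, 1, 2], [0, 1, 2])) + b  # shape (cout,)
bn_out = scale * (conv - mean) / np.sqrt(var + eps) + offset

new_w, new_b = fold_bn(w, b, mean, var, offset, scale, eps)
folded = np.tensordot(x, new_w, axes=([0, 1, 2], [0, 1, 2])) + new_b

print(np.allclose(bn_out, folded))  # -> True: the fold is algebraically exact
```

Since the algebra checks out, a mismatch in MATLAB usually points at a bookkeeping issue (wrong layer paired with the BN, a conv layer with no bias, or comparing activations taken at different points in the network) rather than at the formula.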
However, when I compare the outputs of the BN layer with the outputs of the convolution layer into which the BN parameters were folded, I get different results.
[Image: left, output of the BN layer; right, output of the conv layer with the BN parameters folded in.]
Has anyone successfully done this?

Answers (0)

Version: R2018a
