Deep Learning Layers, incorrect output
I have created the following layer. Testing the math outside the layer, with input1 of size [68,68,1], input2 of size [68,1,3], and input3 of size [68,17,3], I should get back Z1 of size [17,17,1] and Z2 of size [17,1,3]. But when I analyze the layer, it gives me back scalars for Z1 and Z2 (out1 and out2 both equal 17), and I don't know what I have done wrong.
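For reference, the expected size arithmetic can be reproduced with plain arrays outside any layer (a minimal sketch; the variable names are only illustrative):

% Sketch of the intended size arithmetic with plain doubles.
A = rand(68,68);   % first channel of input1
P = rand(68,17);   % first channel of input3
x = rand(68,1);    % one column of input2
Z1 = P'*A*P;       % (17x68)*(68x68)*(68x17) -> 17-by-17
Z2 = P'*x;         % (17x68)*(68x1)          -> 17-by-1

The layer definition: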
classdef SpectralPoolingLayer < nnet.layer.Layer
    % Custom spectral pooling layer with three inputs and two outputs.

    properties
        % (No non-learnable parameters.)
    end

    properties (Learnable)
        % (No learnable parameters.)
    end

    methods
        function layer = SpectralPoolingLayer(numInputs,numOutputs,name)
            % layer = SpectralPoolingLayer(numInputs,numOutputs,name) creates
            % a spectral pooling layer and specifies the number of inputs,
            % the number of outputs, and the layer name.

            % Set number of inputs and outputs.
            layer.NumInputs = numInputs;
            layer.NumOutputs = numOutputs;

            % Set layer name.
            layer.Name = name;

            % Set layer description.
            layer.Description = "SpectralPooling of " + numInputs + ...
                " inputs";
        end

        function [Z1,Z2] = predict(~,X1,X2,X3)
            % [Z1,Z2] = predict(layer,X1,X2,X3) forwards the input data
            % through the layer and outputs the results Z1 and Z2.

            % Initialize outputs.
            A = X1(:,:,1);                              % 68-by-68
            P = X3(:,:,1);                              % 68-by-17
            sz3 = size(X3);
            Z1 = zeros([sz3(2),sz3(2),1],'like',X3);    % 17-by-17-by-1
            sz = size(X2);
            Z2 = zeros([sz3(2),1,sz(3)],'like',X2);     % 17-by-1-by-3

            % Spectral pooling: project A and each column of X2 with P.
            Z1(:,:,1) = P'*A*P;
            Z1 = max(0,Z1);
            for i = 1:sz(3)
                Z2(:,1,i) = P'*X2(:,1,i);
            end
        end
    end
end
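One way to sanity-check the layer outside analyzeNetwork is to call predict directly with arrays of the sizes above, and to run checkLayer from Deep Learning Toolbox. This is only a sketch, not a fix: the layer name 'spectralpool' is arbitrary, and the 'ObservationDimension' value of 4 is an assumption about where the batch dimension goes (checkLayer also exercises batched inputs, so it may flag the same size issue the analyzer reports).

layer = SpectralPoolingLayer(3,2,'spectralpool');

% Direct call with unbatched arrays of the sizes described above.
X1 = rand(68,68,1);
X2 = rand(68,1,3);
X3 = rand(68,17,3);
[Z1,Z2] = predict(layer,X1,X2,X3);
size(Z1)   % expected 17-by-17
size(Z2)   % expected 17-by-1-by-3

% Layer validity checks (Deep Learning Toolbox); observation dimension assumed.
checkLayer(layer,{[68 68 1],[68 1 3],[68 17 3]},'ObservationDimension',4)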