Deep Learning custom layer: learnable parameters not updating
2 views (last 30 days)
Mathieu Chêne
on 12 Jan 2022
Commented: Mathieu Chêne
on 14 Jan 2022
Hello,
I am working on a deep learning project in which I try to classify data from a CSV file. I tried to use a custom layer, but when I train the network the loss stays essentially constant, as if the weights were never updated:
![](https://www.mathworks.com/matlabcentral/answers/uploaded_files/860415/image.png)
Do you know what could be the reason for this behavior?
I am confident in my dataset, because when I use a fullyConnectedLayer instead of my custom layer the training works perfectly and testing gives me 100% accuracy.
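One check I intend to run is checkLayer, which numerically verifies a custom layer's gradients against its backward function. A minimal sketch (the class name myWeightedLayer and the sizes 10 and 5 are placeholders for my actual layer, and I am assuming the observation dimension is 2 since X is InputSize-by-numObs):
layer = myWeightedLayer(10, 5);                      % placeholder: InputSize = 10, OutputSize = 5
checkLayer(layer, [10 1], 'ObservationDimension', 2) % compares backward against numerical gradients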
I also include the predict and the backward function from my custom layer, where Weights is a learnable parameter:
function Z = predict(layer, X)
    % Z = predict(layer, X) forwards the input data X through the
    % layer and outputs the result Z.
    W = layer.Weights;
    numObs = size(X,2); % number of observations (renamed from numel, which shadows the built-in function)

    % Initialize output
    Z = zeros(layer.OutputSize, numObs, "single");

    % Weighted addition: Z(j,k) = sum over i of W(j,i)*X(i,k)
    for k = 1:numObs
        for j = 1:layer.OutputSize
            for i = 1:layer.InputSize
                Z(j,k) = Z(j,k) + W(j,i)*X(i,k);
            end
        end
    end
end
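As an aside, the triple loop in predict is equivalent to a single matrix product, which is faster and leaves less room for indexing mistakes (assuming, as above, that W is OutputSize-by-InputSize and X is InputSize-by-numObs):
Z = W*X; % same result as the loops: Z(j,k) = sum over i of W(j,i)*X(i,k)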
function [dLdX, dLdWeight] = backward(layer, X, ~, dLdZ, ~)
    % Backward propagation of the loss gradient dLdZ through the layer,
    % returning the gradients with respect to the input X and to Weights.
    W = layer.Weights;

    % Initialization
    dLdWeight = zeros(size(W), "single");
    dLdX = zeros(size(X), "single");

    % Backward operation
    for k = 1:size(X,2)
        for j = 1:layer.OutputSize
            for i = 1:layer.InputSize
                dLdWeight(j,i) = dLdWeight(j,i) + X(i,k)*dLdZ(j,k);
                dLdX(i,k) = dLdX(i,k) + W(j,i)*dLdZ(j,k);
            end
        end
    end
end
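These loops likewise reduce to two matrix products (same size assumptions as above):
dLdWeight = dLdZ*X.'; % OutputSize-by-InputSize; the product sums over observations k
dLdX = W.'*dLdZ;      % InputSize-by-numObs; sums over output units j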
Thank you in advance for your help.
Mathieu