How to update weights in a neural network using the error back-propagation algorithm?
Good morning. I am working on the classification of imbalanced data using the error back-propagation algorithm. In my work the Ann-thyroid data is transformed into two-class problems: "Ann-thyroid13 (23)" means that class 1 (2) is the minority class while class 3 is treated as the majority class. My question is how to update the weights in my problem. My attempt is below, followed by a sketch of the standard update step as I understand it.

x=textread('D:\UCI\thyroid-disease\ann-train.data');   % read the whole document
lst=x(:,22);   % retrieve the last column (class label)
[m,n]=size(x);
k=1;           % store minority class 1
for i=1:m
    if (x(i,22)==1)
        a(k,:)=x(i,:);
        k=k+1;
    end
end
[m1,n1]=size(a);
k1=1;   % store majority class 3
for i=1:m
    if (x(i,22)==3)
        a1(k1,:)=x(i,:);
        k1=k1+1;
    end
end
[m2,n1]=size(a1);
k2=1;   % store minority class 2
for i=1:m
    if (x(i,22)==2)
        a2(k2,:)=x(i,:);
        k2=k2+1;
    end
end
[m3,n1]=size(a2);
ann13=vertcat(a,a1);
ann23=vertcat(a2,a1);
[m4,n1]=size(ann13);
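% The same split can also be written with logical indexing; the lines below are
% only an equivalent alternative to the loops above and give the same matrices:
a   = x(x(:,22)==1,:);   % minority class 1
a1  = x(x(:,22)==3,:);   % majority class 3
a2  = x(x(:,22)==2,:);   % minority class 2
ann13 = [a; a1];         % Ann-thyroid13: class 1 vs class 3
ann23 = [a2; a1];        % Ann-thyroid23: class 2 vs class 3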
disp('Number of Input Nodes 21');
disp('Number of Hidden Nodes 16');
disp('Number of Output Nodes 2');
n=21;      % input nodes
h=16;      % hidden nodes
m=2;       % output nodes (note: this overwrites the row count m from above)
Tp=3581;   % number of training patterns
disp('Generate weights between -.0001 and .0001');
% note: linspace gives evenly spaced values, not random ones; rand would be
% needed for a truly random initialisation
w=linspace(-.0001,.0001,336);   % 16*21 = 336 weights
w1=reshape(w,h,n);              % hidden-by-input weight matrix
disp('Input layer to Hidden Layer weights');
disp(w1);
% forward pass through the hidden layer for one example pattern
% (here the first row of ann13; columns 1..21 hold the inputs)
for j=1:h
    net_h=0;
    for i=1:n
        net_h=net_h+w1(j,i)*ann13(1,i);   % weighted sum of the pattern's inputs
    end
    h1(j)=tanh(net_h/2);                  % activation of hidden node j
end
disp('Hidden Layer values');
disp(h1);
v=linspace(-.0001,.0001,32);   % 2*16 = 32 weights
v1=reshape(v,m,h);             % output-by-hidden weight matrix
disp('Hidden Layer to Output Layer weights');
disp(v1);
for k=1:m
    net_o=0;
    for j=1:h
        net_o=net_o+v1(k,j)*h1(j);   % weighted sum of the hidden activations
    end
    y(k)=tanh(net_o/2);              % activation of output node k
end
disp('Output Layer values');
disp(y);
[yr,yc]=size(y);
% t1 is assumed to hold the target values; it is not defined anywhere above
for i=1:n1
    for j=i:n1
        if ann13(j,i)==t1(j,i)
            tk=1;
        else
            tk=-1;
        end
    end
end
for p=1:Tp
    for k=1:m
        if tk==1
            e=1/2*(t1(k)-y(k))^2;   % conventional squared error (square, not sqrt)
        else
            e=1/2*(t(k)-y(k))^2;    % t is assumed to be a second target vector
        end
    end
end
% note: e keeps only the last value; summing over p and k would give the total error
disp('The conventional error function of pattern p');
disp(e);
neta=0.001*((n+1)+(m+1))/2;   % learning parameter
for i=1:n
    for j=1:n
        if tk==1
            w1=neta*(t1(j,i)-h1(j,i)*y(j,i)*ann13(j,i));   % the '*' after neta was missing
        else
            w1=neta*(t1(j)-h1(j)*y(j)*ann13(j,i));
        end
    end
end
disp(w1);
% note: this overwrites w1 with a single number instead of updating the weight
% matrix; the standard delta-rule update is sketched below
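From what I understand, the standard online (per-pattern) back-propagation step for my 21-16-2 network would look roughly like the sketch below, assuming the activation a = tanh(net/2) used in my code, so that its derivative is (1 - a^2)/2, and the squared error E = 1/2*sum((t-y)^2). The names W1, V1, xp, tp, hp, yp, delta_o and delta_h are only illustrative, and the target vector tp is just an example. Is this the correct way to update the weights?

% re-initialise the weight matrices as above (illustrative names W1, V1)
W1 = reshape(linspace(-1e-4,1e-4,h*n),h,n);   % 16x21 input-to-hidden weights
V1 = reshape(linspace(-1e-4,1e-4,m*h),m,h);   % 2x16 hidden-to-output weights

% one back-propagation step for a single training pattern
xp = ann13(1,1:n)';        % example pattern: first row, inputs in columns 1..21
tp = [1; -1];              % illustrative bipolar target for the 2 output nodes

% forward pass, activation a = tanh(net/2) so da/dnet = (1 - a.^2)/2
net_h = W1*xp;             % 16x1 net inputs of the hidden layer
hp    = tanh(net_h/2);     % hidden activations
net_o = V1*hp;             % 2x1 net inputs of the output layer
yp    = tanh(net_o/2);     % output activations

% error terms (deltas) for the squared error E = 1/2*sum((tp-yp).^2)
delta_o = (tp - yp).*(1 - yp.^2)/2;       % output layer
delta_h = (V1'*delta_o).*(1 - hp.^2)/2;   % error back-propagated to the hidden layer

% weight update: new weight = old weight + eta * delta * input of that layer
V1 = V1 + neta*delta_o*hp';   % hidden-to-output weights (2x16)
W1 = W1 + neta*delta_h*xp';   % input-to-hidden weights (16x21)

Repeating this step over all rows of ann13 (and ann23) for several epochs would then give the usual online training loop.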