Testing Multilayer Perceptron Against Noise
The following script is from Trappenberg's Fundamentals of Computational Neuroscience and is used to test a perceptron's robustness against noise.
However, how would one alter it to test the output of a multilayer perceptron?
In particular, wOut and rIn are non-conformable, because the wOut of a multilayer perceptron with 2 hidden layers is 26×2.
So how could I alter this script to test the trained multilayer perceptron?
%% Testing generalization performance of the trained perceptron
perceptron_network_sim;        % book script: trains the net and defines wOut, rIn, rDes
letterMatrix = rIn;            % noise-free letter patterns
for nflip = 1:80
    dist1 = []; dist2 = [];
    for trial = 1:10
        % Flip nflip randomly chosen bits of each letter pattern
        rIn = abs(letterMatrix - randomFlipMatrix(nflip));
        % Threshold output function
        rOut1 = (wOut*rIn) > 0;
        nerror = 0;
        for j = 1:26
            nerror = nerror + (sum(rDes(:,j) ~= rOut1(:,j)) > 0);
        end
        dist1 = [dist1, nerror];
        % Max output function: winner-take-all over the 26 output units
        [~, idx] = max(wOut*rIn);
        rOut2 = zeros(26);
        for j = 1:26
            rOut2(idx(j), j) = 1;
        end
        dist2 = [dist2, 0.5*sum(sum(rDes ~= rOut2))];
    end
    meanDist1(nflip) = mean(dist1); stdDist1(nflip) = std(dist1);
    meanDist2(nflip) = mean(dist2); stdDist2(nflip) = std(dist2);
end
figure; hold on;
errorbar((1:80)/156,meanDist1,stdDist1,':')
errorbar((1:80)/156,meanDist2,stdDist2,'r')
xlabel('Fraction of flipped bits')
ylabel('Average number of wrong letters')
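One way to adapt the script for a multilayer network is to propagate the noisy patterns through the hidden layer(s) first, so that wOut multiplies the hidden activations rather than the raw input — that is exactly why wOut*rIn is non-conformable. A minimal sketch of the changed lines of the inner loop, assuming the trained hidden weight matrices are called wHidden1 and wHidden2 and the hidden units use a logistic transfer function (the names and the activation function are assumptions; substitute whatever your training script actually produced):

```matlab
% Sketch: replace the readout lines of the inner trial loop with a full
% forward pass.  wHidden1, wHidden2 are hypothetical names for the trained
% hidden-layer weights; adjust to match your training script.
rIn = abs(letterMatrix - randomFlipMatrix(nflip));  % noisy patterns
rHidden1 = 1./(1 + exp(-wHidden1*rIn));        % first hidden layer (logistic)
rHidden2 = 1./(1 + exp(-wHidden2*rHidden1));   % second hidden layer
% wOut (26 x nHidden) now conforms with rHidden2 (nHidden x 26 patterns)
rOut1 = (wOut*rHidden2) > 0.5;   % threshold readout (use > 0 if your
                                 % output units are linear threshold units)
[~, idx] = max(wOut*rHidden2);   % max (winner-take-all) readout
```

The rest of the error-counting and plotting code can stay unchanged, since it only compares rOut1/rOut2 against rDes. If your hidden layers were trained with bias/threshold terms, include them in each layer's net input in the same way.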