Calculate the classification accuracy after training a "pretrained model"
Rayan Matlob on 28 Jun 2022
Commented: Dehia on 2 Oct 2023
How do I calculate the MSE, MAE, RMSE, or any other classification accuracy metric for a pretrained model after training? Here is my code:
% Load the image data and split it into training and validation sets
imds = imageDatastore('C:\Users\Rayan\Desktop\Work\9_5_work_on_4_groups\9_1\R_9_1_GSM', ...
    'IncludeSubfolders',true, ...
    'LabelSource','foldernames');
[imdsTrain,imdsValidation] = splitEachLabel(imds,0.7,'randomized');
numTrainImages = numel(imdsTrain.Labels);
idx = randperm(numTrainImages,16);
% Load the pretrained ResNet-50 and inspect its architecture
net = resnet50;
deepNetworkDesigner(net)
analyzeNetwork(net)
inputSize = net.Layers(1).InputSize;
lgraph = layerGraph(net);
% findLayersToReplace is a helper function shipped with the transfer-learning example
edit(fullfile(matlabroot,'examples','nnet','main','findLayersToReplace.m'))
[learnableLayer,classLayer] = findLayersToReplace(lgraph);
[learnableLayer,classLayer] %#ok<NOPTS>
numClasses = numel(categories(imdsTrain.Labels));
%numClasses = 3
% Replace the final learnable layer and the classification output layer
% so that the network predicts the new number of classes
if isa(learnableLayer,'nnet.cnn.layer.FullyConnectedLayer')
    newLearnableLayer = fullyConnectedLayer(numClasses, ...
        'Name','new_fc', ...
        'WeightLearnRateFactor',10, ...
        'BiasLearnRateFactor',10);
elseif isa(learnableLayer,'nnet.cnn.layer.Convolution2DLayer')
    newLearnableLayer = convolution2dLayer(1,numClasses, ...
        'Name','new_conv', ...
        'WeightLearnRateFactor',10, ...
        'BiasLearnRateFactor',10);
end
lgraph = replaceLayer(lgraph,learnableLayer.Name,newLearnableLayer);
newClassLayer = classificationLayer('Name','new_classoutput');
lgraph = replaceLayer(lgraph,classLayer.Name,newClassLayer);
% Freeze the weights of the first 20 layers (freezeWeights and
% createLgraphUsingConnections are helper functions from the same example)
layers = lgraph.Layers;
connections = lgraph.Connections;
layers(1:20) = freezeWeights(layers(1:20));
lgraph = createLgraphUsingConnections(layers,connections);
% Resize the images to the network input size
augimdsTrain = augmentedImageDatastore(inputSize(1:2),imdsTrain);
augimdsValidation = augmentedImageDatastore(inputSize(1:2),imdsValidation);
miniBatchSize = 10;
valFrequency = floor(numel(augimdsTrain.Files)/miniBatchSize);
options = trainingOptions('sgdm', ...
    'MiniBatchSize',miniBatchSize, ...
    'MaxEpochs',6, ...
    'InitialLearnRate',0.0007, ...
    'Shuffle','every-epoch', ...
    'ValidationFrequency',valFrequency, ...
    'ValidationData',augimdsValidation, ...
    'Verbose',false, ...
    'Plots','training-progress');
net = trainNetwork(augimdsTrain,lgraph,options);
% Classify the validation images and compute the overall accuracy
[YPred,probs] = classify(net,augimdsValidation);
accuracy = mean(YPred == imdsValidation.Labels);
% Display a sample of validation images with predicted labels and scores
idx = randperm(numel(imdsValidation.Files),100);
R = 1;
for j = 1:24
    figure(j)
    for i = 1:4
        subplot(2,2,i)
        I = readimage(imdsValidation,idx(R));
        imshow(I)
        label = YPred(idx(R));
        title(string(label) + ", " + num2str(100*max(probs(idx(R),:)),3) + "%");
        R = R+1;
    end
end
0 Comments
Accepted Answer
Andreas Apostolatos on 28 Jun 2022
Hi Rayan,
From the code snippet you shared, it appears that you are training a neural network for classification and then performing inference on some validation data,
net = trainNetwork(augimdsTrain,lgraph,options);
[YPred,probs] = classify(net,augimdsValidation);
accuracy = mean(YPred == imdsValidation.Labels);
Error measures such as the Mean Squared Error (MSE) or the Root Mean Square Error (RMSE) are suited for regression problems, where the response variables are continuous, not for classification problems.
To evaluate the performance of a classifier, it is more appropriate to use a confusion matrix or to compute the percentage of responses that the classifier has predicted correctly. The corresponding workflow is outlined in the following link,
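For example, a minimal sketch based on the variables in your snippet (YPred from classify and the ground-truth labels in imdsValidation.Labels); confusionchart and confusionmat assume the corresponding toolboxes are available:
% Overall accuracy and confusion matrix for the validation set
YValidation = imdsValidation.Labels;
accuracy = mean(YPred == YValidation)      % fraction of correctly classified images
figure
confusionchart(YValidation,YPred)          % confusion matrix displayed as a chart
C = confusionmat(YValidation,YPred);       % numeric confusion matrix, rows = true class, columns = predicted class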
I hope that you find this information useful for your needs.
Kind regards
Andreas
2 Comments
Dehia on 2 Oct 2023
Could you assist me in calculating the F-score, recall, sensitivity, and ROC curve, please?
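A minimal sketch, assuming the YPred, probs, and imdsValidation variables from the code above and that the columns of probs are ordered like categories(imdsValidation.Labels); confusionmat and perfcurve require Statistics and Machine Learning Toolbox:
% Per-class precision, recall (sensitivity), and F1-score from the confusion matrix
YValidation = imdsValidation.Labels;
classNames  = categories(YValidation);
C  = confusionmat(YValidation,YPred);          % rows = true class, columns = predicted class
tp = diag(C);                                  % true positives per class
precision = tp ./ sum(C,1)';                   % column sums = number of predicted labels per class
recall    = tp ./ sum(C,2);                    % row sums = number of true labels per class (recall = sensitivity)
f1        = 2*(precision.*recall) ./ (precision + recall);
% One-vs-all ROC curve per class using the classification scores
figure
hold on
for k = 1:numel(classNames)
    [X,Y,~,AUC] = perfcurve(YValidation,probs(:,k),classNames{k});
    plot(X,Y,'DisplayName',sprintf('%s (AUC = %.2f)',classNames{k},AUC))
end
xlabel('False positive rate')
ylabel('True positive rate')
legend('show')
hold off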
More Answers (0)