
Calculate Sensitivity and Specificity from Code generated from Classification Learner

54 views (last 30 days)
I trained my dataset in the Classification Learner app and tried to compute classification performance using leave-one-out cross-validation. Since Classification Learner doesn't support this K-fold configuration, I instead generated the code for training the currently selected model.
I have tried to compute the sensitivity and specificity, but every approach I found depends on predicted class labels, and I can't obtain the resulting class labels since there is no new dataset; I just want to evaluate the trained model.
Is there any way to obtain the sensitivity and specificity, or the confusion matrix, from the code generated by the Classification Learner app?
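For later readers, a minimal sketch of switching the exported script's cross-validation to leave-one-out. The names `trainedClassifier`, `response`, and the model field accessed inside `trainedClassifier` follow the conventions of code exported from Classification Learner, but the exact names in your generated script may differ:

```matlab
% The exported code typically sets up cross-validation with a fixed
% number of folds, e.g.:
%   partitionedModel = crossval(trainedClassifier.ClassificationSVM, 'KFold', 5);
% Replacing the K-fold partition with a leave-out partition gives
% leave-one-out cross-validation (one test observation per fold).
% The field name (ClassificationSVM, ClassificationTree, ...) depends on
% the model type you trained.
cvp = cvpartition(response, 'LeaveOut');
partitionedModel = crossval(trainedClassifier.ClassificationSVM, ...
    'CVPartition', cvp);
```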

Accepted Answer

Sarah Ayyad
Sarah Ayyad on 28 Sep 2021
Edited: Sarah Ayyad on 28 Sep 2021
I computed all the performance metrics in the following way:
% Cross-validated predictions from the partitioned model created by the
% code generated from Classification Learner
[validationPredictions, validationScores] = kfoldPredict(partitionedModel);
% response is the last column of the dataset, holding the true class labels
confmat = confusionmat(response, validationPredictions);
% For a binary problem, confusionmat is ordered so that:
TP = confmat(2, 2);   % true positives
TN = confmat(1, 1);   % true negatives
FP = confmat(1, 2);   % false positives
FN = confmat(2, 1);   % false negatives
Accuracy    = (TP + TN) / (TP + TN + FP + FN);
Sensitivity = TP / (TP + FN);   % true positive rate
Specificity = TN / (TN + FP);   % true negative rate
FPR         = FP / (FP + TN);   % false positive rate = 1 - Specificity
% ROC built from the single operating point (0,0) -> (FPR,TPR) -> (1,1);
% trapz integrates TPR over FPR. This works for binary classification only.
TPR = [0; Sensitivity; 1];
X   = [0; FPR; 1];
AUC = trapz(X, TPR);
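Since kfoldPredict also returns per-class scores, a score-based ROC (using every threshold rather than a single operating point) is possible as well. A sketch using perfcurve, assuming the positive class is the second entry of the partitioned model's ClassNames; adjust posClass to match your data:

```matlab
% validationScores(:, 2) holds the scores for the second class in the
% model's ClassNames order; posClass must name the positive class.
posClass = partitionedModel.ClassNames(2);
[rocX, rocY, ~, aucScore] = perfcurve(response, validationScores(:, 2), posClass);
plot(rocX, rocY);
xlabel('False positive rate');
ylabel('True positive rate');
```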

More Answers (0)
