Hi,
I have attached the code I use to classify my data. I use 16 different models. What I want to do is the following:
  1. I want to save/export the model sort of like the Classification Learner app does in order to make predictions on new data.
  2. I want to make a ROC curve with AUC results for each of the models
How can I do that?

Accepted Answer

Ridwan Alam
Ridwan Alam on 18 Dec 2019

1 vote

1. Save: (assuming you want to save/export each classifier to a separate file) use save().
2. ROC curve: use perfcurve() and plot() with hold on;
% Linear SVM
% Note: targetsLinSVM_all and predsLinSVM_all must be initialized to []
% before this block (e.g. before the cross-validation loop).
posclass = 1; % declare the positive class label for perfcurve
tic
classificationLinearSVM = fitcsvm(...
    trainingData(train,1:end-1), ...
    trainingData(train,end), ...
    'KernelFunction', 'linear', ...
    'PolynomialOrder', [], ...
    'KernelScale', 'auto', ...
    'BoxConstraint', 1, ...
    'Standardize', true, ...
    'ClassNames', [0; 1]);
[predsLinSVM,~] = predict(classificationLinearSVM, trainingData(test,1:end-1));
targetLinSVM = trainingData(test,end);
targetsLinSVM_all = [targetsLinSVM_all; squeeze(targetLinSVM)];
predsLinSVM_all = [predsLinSVM_all; squeeze(predsLinSVM)];
t1 = toc;
save('classificationLinearSVM.mat','classificationLinearSVM','-v7.3');
% fitPosterior converts SVM scores into posterior probabilities for perfcurve
[~,scoresLinSVM] = resubPredict(fitPosterior(classificationLinearSVM));
[xLinSVM,yLinSVM,~,aucLinSVM] = perfcurve(trainingData(train,end),scoresLinSVM(:,2),posclass);
plot(xLinSVM,yLinSVM); hold on;
Hope this helps!

9 Comments

Uerm
Uerm on 22 Dec 2019
Edited: Uerm on 22 Dec 2019
Hi,
The code seems to work. However, fitPosterior only works for SVM. I also use k-Nearest Neighbor and Random Forest and this function will not work on those classifiers. Are there "fitPosterior" versions for these classifiers as well?
I get the following error:
Undefined function 'fitPosterior' for input arguments of type 'ClassificationKNN'.
I have tried simply removing fitPosterior for the kNN and Random Forest classifiers and it seems to work, but I am not sure it is implemented correctly.
Another thing: when we use 'save' to save each of the trained classifiers, how do we use them to make predictions on a new data set (code-wise)?
Ridwan Alam
Ridwan Alam on 22 Dec 2019
Edited: Ridwan Alam on 22 Dec 2019
Sure. You can find more details about using perfcurve() here:
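For classifiers that have no fitPosterior method (it is SVM-specific), the second output of predict() already provides per-class scores that perfcurve() accepts. A minimal sketch, assuming a trained ClassificationKNN model named classificationKNN and illustrative test-set variable names:

```matlab
% Sketch: ROC/AUC for a kNN model without fitPosterior.
posclass = 1;   % positive class label
[~, scoresKNN] = predict(classificationKNN, testFeatures);
% Column 2 of the score matrix corresponds to the second entry of
% classificationKNN.ClassNames (here, class 1).
[xKNN, yKNN, ~, aucKNN] = perfcurve(testLabels, scoresKNN(:,2), posclass);
plot(xKNN, yKNN); hold on;
```

The same pattern works for ensemble classifiers: their predict() scores can be fed straight into perfcurve().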
After save, you can simply load those classifiers just like any variable, and use predict() or model.predictFcn() as you prefer. More details here:
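As a sketch of the load-then-predict step (variable names such as newData are illustrative, and it is assumed newData has the same predictor columns used for training):

```matlab
% Reload a saved model and predict on new data.
loaded = load('classificationLinearSVM.mat');   % returns a struct
mdl = loaded.classificationLinearSVM;
[predictedLabels, scores] = predict(mdl, newData);
% For a struct exported from the Classification Learner app, use instead:
% predictedLabels = trainedModel.predictFcn(newData);
```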
Uerm
Uerm on 23 Dec 2019
Cheers, it helped a lot! Is there a difference between saving the model inside the for loop or after it? I currently have the save call for each model inside the loop.
Ridwan Alam
Ridwan Alam on 23 Dec 2019
Good question. That depends entirely on the purpose of the loop. If the loop is meant to help you find the best-performing model among these different types, you don't need to save the models in every iteration, only their performance metrics. After the loop, compare those results, find the best model, then retrain that model type and save it. But if the purpose is different and you want all the intermediate models, you can save those inside the loop. Good luck!
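A minimal sketch of that idea, keeping only each model's accuracy inside the loop and retraining the winner afterwards (trainFcns, Xtrain/ytrain, and numModels are hypothetical placeholders, not names from the attached code):

```matlab
% Keep performance only inside the loop; save just the best model.
accuracy = zeros(numModels, 1);
for m = 1:numModels
    mdl = trainFcns{m}(Xtrain, ytrain);      % hypothetical trainer handles
    preds = predict(mdl, Xtest);
    accuracy(m) = mean(preds == ytest);      % store the metric, not the model
end
[~, best] = max(accuracy);
bestModel = trainFcns{best}(X, y);           % retrain the winner on all data
save('bestModel.mat', 'bestModel');
```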
Uerm
Uerm on 5 Jan 2020
Edited: Uerm on 5 Jan 2020
Hi again,
I just want to save the "full" models which will be used to classify new data. It should do the same as the "Export Model" button when using the Classification Learner app.
Note: It seems to save it correctly. When I load the exported model and look at the predictors (features) and response (labels), the number of elements of these is approx 90% of the input data, which makes sense, since it trains on 90% of the input and tests on the last 10% (10-fold cross validation).
Another thing: The way I have used tic and toc... Will they only show the elapsed time for one intermediate result? I want the elapsed time for each individual classifier (full models).
Ridwan Alam
Ridwan Alam on 6 Jan 2020
As far as I remember, your loop iteration is the number of folds for the cross-validation, right? In that case, if you put the save() command inside the loop, it will keep over-writing on every iteration, and at the end you will only have the model (of each kind, e.g. SVM, random forest, etc.) trained during the last iteration. That would be the same if you use save() outside the loop, since you are using the same model names for each iteration. Hope this makes sense.
About tic-toc: if you want to see the amount of time it takes to train each model, put the toc before the predict() part. Otherwise, the measured time also includes prediction, squeeze, and so on.
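A short sketch of that tic/toc placement, with illustrative variable names:

```matlab
% Time training and prediction separately.
tic;
mdl = fitcsvm(Xtrain, ytrain);   % training
trainTime = toc;                 % elapsed training time only
tic;
preds = predict(mdl, Xtest);     % prediction timed on its own
predictTime = toc;
```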
Uerm
Uerm on 6 Jan 2020
Ok, it makes total sense. Regarding the save part... Does it mean that I have to have 10 save commands for each model since it keeps over-writing?
Ridwan Alam
Ridwan Alam on 6 Jan 2020
Edited: Ridwan Alam on 6 Jan 2020
Say, for the SVM models, if you really want to save the 10 SVM models from each iteration, you can give them a new name in each iteration (e.g. mySvm_1, mySvm_2, ...) and save all of them after exiting the loop. But, again, I don't think it's very common to save the intermediate models from all the iterations of the cross-validation. Good luck.
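Instead of renaming variables by hand, one sketch is to collect each fold's model in a cell array and save the array once (the fold count and Xtrain/ytrain cell arrays here are illustrative):

```matlab
% Collect per-fold models, then save them all in one file after the loop.
svmModels = cell(10, 1);
for fold = 1:10
    svmModels{fold} = fitcsvm(Xtrain{fold}, ytrain{fold});
end
save('svmModels.mat', 'svmModels', '-v7.3');  % one file, all 10 folds
```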
Btw, if you liked the conversation, please vote up the response. Thanks!
Uerm
Uerm on 10 Jan 2020
Hi Ridwan,
Thanks a lot, I voted up the response!
I have run into another problem (I have attached the code). When I plot the confusion matrix and ROC curve, it seems that the results from the training and validation are combined into one. What I mean is that, for instance, when the numbers in the confusion matrix are summed, the total exactly equals all the samples (training samples + validation samples). I want two confusion matrices (and two ROC curves, and thus two AUC values) for every model --> one for the training and one for the validation. Is that possible?
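One way to sketch that separation, assuming the data has already been split into illustrative Xtrain/ytrain and Xval/yval sets and mdl is a trained classifier, is to evaluate each set on its own:

```matlab
% Evaluate training and validation sets separately, so each gets its
% own confusion matrix and ROC/AUC.
[predsTrain, scoresTrain] = predict(mdl, Xtrain);
[predsVal,   scoresVal]   = predict(mdl, Xval);
cmTrain = confusionmat(ytrain, predsTrain);   % training confusion matrix
cmVal   = confusionmat(yval,   predsVal);     % validation confusion matrix
[xT, yT, ~, aucTrain] = perfcurve(ytrain, scoresTrain(:,2), 1);
[xV, yV, ~, aucVal]   = perfcurve(yval,   scoresVal(:,2),   1);
```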


More Answers (0)


Version: R2019b

Asked: on 18 Dec 2019

Commented: on 10 Jan 2020
