SVM Cross Validation Training

Nedz on 7 May 2020
Answered: Gayathri on 3 Jan 2025
I am using K-fold cross-validation with K = 10.
I am supposed to run 10-fold cross-validation and take the average of the SVM performance across the folds.
How should I do this? Does running the cross-validation once generate a prediction for a single fold, or a complete 10-fold prediction?
1 Comment
Mohammad Sami on 8 May 2020
According to the documentation, it is the average over all folds:
https://www.mathworks.com/help/releases/R2020a/stats/select-data-and-validation-for-classification-problem.html
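To make the averaging concrete, here is a minimal sketch that builds an explicit 10-fold partition with "cvpartition", trains an SVM on each training fold, and averages the per-fold test accuracy. The ionosphere data set is only a stand-in for your own X and Y:
load ionosphere                              % example data; substitute your own X and Y
c = cvpartition(Y,'KFold',10);               % stratified 10-fold partition
foldAcc = zeros(c.NumTestSets,1);
for i = 1:c.NumTestSets
    trIdx = training(c,i);                   % logical index of the training fold
    teIdx = test(c,i);                       % logical index of the held-out fold
    mdl = fitcsvm(X(trIdx,:),Y(trIdx),'Standardize',true, ...
        'KernelFunction','RBF','KernelScale','auto');
    pred = predict(mdl,X(teIdx,:));          % predict on the held-out fold
    foldAcc(i) = mean(strcmp(pred,Y(teIdx)));% fraction of correct labels in this fold
end
meanAccuracy = mean(foldAcc)                 % average SVM performance over the 10 folds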


Answers (1)

Gayathri on 3 Jan 2025
Hi @Nedz,
I understand that you need to perform K-fold cross-validation for an SVM model. For this purpose you can use the "crossval" function, and then the "kfoldLoss" function to get the classification loss of the cross-validated model. Please refer to the code below, which implements this.
load ionosphere
% Train an SVM classifier using the radial basis function (RBF) kernel
SVMModel = fitcsvm(X,Y,'Standardize',true,'KernelFunction','RBF','KernelScale','auto');
% Cross-validate the SVM classifier (10-fold by default)
CVSVMModel = crossval(SVMModel);
% Estimate the out-of-sample misclassification rate
classLoss = kfoldLoss(CVSVMModel)
"crossval" by default uses 10-fold cross-validation.
Please refer to the "Train and Cross-Validate SVM Classifier" example in the documentation.
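If you specifically want the per-fold losses so that you can average them yourself (as in your question), "kfoldLoss" also accepts the 'Mode','individual' option. A short sketch, continuing from the code above:
foldLosses = kfoldLoss(CVSVMModel,'Mode','individual')   % one misclassification rate per fold
avgLoss = mean(foldLosses)                                % close to the default averaged output
The default 'Mode','average' already reports the loss averaged over the folds, so both approaches answer your question about averaging the performance over the 10 folds.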
Hope you find this information helpful!
