Why is SVM not giving the expected result?

2 views (last 30 days)
Diver
Diver on 21 Oct 2015
Commented: Diver on 23 Oct 2015
I have training data composed of only one feature.
  • The feature has around 113K observations.
  • Only 8K of those observations belong to the positive class.
  • 105K of those observations belong to the negative class.
  • Of the 8K positive observations, 90% have a value below 1 and 10% a value above 1.
  • Of the 105K negative observations, 80% have a value above 1 and 20% a value below 1.
Hence, almost any X value below 1 should be predicted as the positive class, and any X value above 1 as the negative class.
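To make the setup easier to reproduce, here is a minimal sketch that builds a synthetic single-feature data set with roughly the class balance described above. The variable names, the ±1 labels, and the uniform value ranges are assumptions for illustration only, not the actual data:
rng(0);                                                        % reproducible synthetic data
% 90% of the 8K positives lie below 1, 10% above 1 (uniform ranges assumed)
xPos = [0.95 + 0.04*rand(7200,1); 1.00 + 0.05*rand(800,1)];
% 80% of the 105K negatives lie above 1, 20% below 1
xNeg = [1.00 + 0.05*rand(84000,1); 0.95 + 0.04*rand(21000,1)];
X = [xPos; xNeg];
Y = [ones(8000,1); -ones(105000,1)];                           % +1 = positive class, -1 = negative class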
I used the following fitcsvm call:
svmStruct = fitcsvm(X,Y,'Standardize',true, 'Prior','uniform','KernelFunction','linear','KernelScale','auto','Verbose',1,'IterationLimit',1000000);
At the end, fitcsvm prints a message saying "SVM optimization did not converge to the required tolerance." But why? Most of the positive-class X values are below 1 and vice versa, so it should be easy to find the classification boundary. And when I run:
[label,score,cost]= predict(svmStruct, X) ;
it gives wrong predictions.
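One way to see what the fitted model actually learned is the hedged diagnostic sketch below. It relies only on the documented ClassificationSVM output ConvergenceInfo and on predict; the probe values around 1 are an arbitrary choice:
svmStruct.ConvergenceInfo                  % Converged flag and reason reported by the solver
xProbe = (0.90:0.02:1.10)';                % a few values straddling the expected boundary at 1
[labelProbe, scoreProbe] = predict(svmStruct, xProbe);
table(xProbe, labelProbe, scoreProbe)      % shows which side of the learned boundary each probe falls on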
Below is a portion of my X values:
0.9911
0.9836
0.9341
0.9751
0.9880
0.9977
0.9853
0.9861
1.0143
1.0086
0.9594
0.9787
0.9927
0.9839
1.0024
0.9931
0.9930
1.0275
  4 Comments
Image Analyst
Image Analyst on 21 Oct 2015
Can you help us by sending your classes to gscatter() and showing us a screenshot?
Diver
Diver on 23 Oct 2015
My understanding is that gscatter() requires X and Y (two features) in addition to the group. In my case I only have one feature plus the group, so I don't think I can use gscatter to plot a single feature.
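A possible workaround, as a hedged sketch: gscatter can still be used with one feature by supplying a dummy second coordinate, and overlaid histograms give an alternative view of the overlap around 1. This assumes X holds the feature and Y holds numeric ±1 class labels as in the synthetic sketch above:
gscatter(X, zeros(size(X)), Y)             % feature on the x-axis, classes by color; y carries no information
set(gca, 'YTick', [])
xlabel('X')
% alternative view: per-class histograms of the single feature
figure
histogram(X(Y == 1)), hold on
histogram(X(Y == -1)), hold off
legend('positive class', 'negative class')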


Answers (0)
