What is the box constraint in the svmtrain function?

63 views (last 30 days)
chinnurocks
chinnurocks on 29 Aug 2016
Answered: CHAABANE BOUZIDI on 16 Apr 2022
Hey,
I would like to know what exactly the box constraint in the svmtrain function is. I went through the MATLAB documentation, but I didn't understand it properly. Please help.

Accepted Answer

Brendan Hamm
Brendan Hamm on 29 Aug 2016
The basic idea is that when the data are not perfectly separable, the training algorithm must allow some misclassification in the training set. It does this by applying a cost to each misclassification. The higher the box constraint, the higher the cost of the misclassified points, leading to a stricter separation of the data.
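Concretely, the box constraint is the C multiplying the misclassification penalty in the soft-margin objective. A minimal sketch in plain Python (the score and label are made up for illustration; no toolbox needed) of how C scales the cost of a single margin violation:

```python
# Soft-margin SVM objective (conceptually what svmtrain minimizes):
#     (1/2) * ||w||^2  +  C * sum_i max(0, 1 - y_i * f(x_i))
# where f(x) = w.x + b and C is the box constraint.

def hinge(y, score):
    """Hinge loss: zero on the safe side of the margin, linear beyond it."""
    return max(0.0, 1.0 - y * score)

# A hypothetical point with label +1 that the classifier scores at -0.5,
# i.e. it sits on the wrong side of the decision boundary.
slack = hinge(+1, -0.5)          # slack of 1.5

for C in (0.1, 1.0, 100.0):
    print(f"C = {C:6.1f}  ->  penalty = {C * slack}")
```

With a small C the optimizer happily trades a few such penalties for a wider margin; with a large C each violation dominates the objective, forcing a stricter fit.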
  7 comments
Atheer Alkubeyyer
Atheer Alkubeyyer on 10 Apr 2019
Edited: Atheer Alkubeyyer on 10 Apr 2019
Thank you, sir, for your answer. I am trying to build three SVM models for binary classification (hard, soft, and kernel/RBF) and compare their performance.
What do you suggest setting the BoxConstraint value to for a hard SVM? Or is there a function that does SVM straightforwardly without setting the penalty parameter (BoxConstraint)?
Also, is it applicable to use the OptimizeHyperparameters option to get the optimal value of BoxConstraint for a soft-margin SVM?
Brendan Hamm
Brendan Hamm on 10 Apr 2019
In all honesty, I struggle with why one would want a hard SVM, as practically speaking it is rare to have perfectly separable data; I've only seen it with toy examples. That being said, I'm not sure of the answer, but I will give some guidance on how I would approach the problem.
I have not experimented with what should be considered sufficiently large to approximate a hard SVM, so some experimentation may be necessary. Without having tried it, I imagine Inf might lead to convergence issues, but it just might do the trick. I would suggest working with a 2-D set of made-up data with one data point over the boundary (non-separable) to experiment with this and determine whether a value of Inf provides a solution for this problem (I assume not). Then remove that one point, repeat, and see if you get a solution; if so, Inf does what you want. If not, repeat with a large value such as 1e10 and see if this works on both problems. Repeat until you are satisfied with a sufficiently large value.
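The experiment above can be sketched with any SVM library; here is a version using scikit-learn's SVC, whose C parameter plays the role of BoxConstraint (an assumption here: a very large finite C, rather than Inf, stands in for the hard margin):

```python
import numpy as np
from sklearn.svm import SVC

# Two clearly separable clusters in 2-D.
rng = np.random.default_rng(0)
X_neg = rng.normal(loc=[-3, 0], scale=0.3, size=(20, 2))
X_pos = rng.normal(loc=[+3, 0], scale=0.3, size=(20, 2))
X = np.vstack([X_neg, X_pos])
y = np.array([-1] * 20 + [+1] * 20)

# A very large C approximates a hard-margin SVM.
hard = SVC(kernel="linear", C=1e10).fit(X, y)
print("separable:", hard.score(X, y))                 # expect 1.0

# Now push one +1 point deep into the -1 cluster (non-separable).
X_bad = np.vstack([X, [[-3.0, 0.0]]])
y_bad = np.append(y, +1)
hard_bad = SVC(kernel="linear", C=1e10).fit(X_bad, y_bad)
print("one outlier:", hard_bad.score(X_bad, y_bad))   # below 1.0
```

On the separable set the huge-C fit classifies everything correctly; once the outlier makes the data non-separable, even a huge C cannot avoid misclassifying it, which is exactly the behavior the experiment is probing for.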
Yes, it is reasonable to use OptimizeHyperparameters to get the BoxConstraint. This will find a local minimum with the lowest loss over a k-fold (k = 5 by default) cross-validation set.
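The equivalent of that cross-validated search outside MATLAB is a grid search over C with k-fold CV; a sketch with scikit-learn (the dataset and grid values here are illustrative, not prescribed):

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import GridSearchCV
from sklearn.svm import SVC

# Synthetic binary-classification data, fixed seed for reproducibility.
X, y = make_classification(n_samples=200, n_features=4,
                           n_informative=2, random_state=0)

# 5-fold cross-validation over a log-spaced grid of box constraints,
# mirroring what fitcsvm's OptimizeHyperparameters does (k = 5 there too).
grid = GridSearchCV(SVC(kernel="rbf"),
                    param_grid={"C": [0.01, 0.1, 1, 10, 100]},
                    cv=5)
grid.fit(X, y)
print("best C:", grid.best_params_["C"])
print("CV accuracy:", round(grid.best_score_, 3))
```

As with any CV-based search, this returns the best value on the grid it explored, not a guaranteed global optimum.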


More Answers (1)

CHAABANE BOUZIDI
CHAABANE BOUZIDI on 16 Apr 2022
You have to write a script that implements a support vector machine.
  • Make sure to give as input parameter the constant C.
Training
  • Solve the quadratic dual problem as discussed in the slides and find the Lagrange multipliers a_n for n = 1, …, N, where N is the number of training rows in the training file. You can use a QP solver in MATLAB (e.g., quadprog) or Python (e.g., cvxopt) that follows the format we discussed during the lecture. This is the stage where you use the training data.
  • Once you find the multipliers a_n, you can use them to determine the \beta, \beta_0 parameters of the decision boundary.
  • Since there are more than 2 classes in the dataset, you will apply one-vs-one classification. This requires constructing SVMs for all possible pairs of classes in your dataset; remember that here you have 45 possible pairs of classes. For each pair you will isolate the corresponding rows of the training file, as discussed in class.
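For reference, the standard soft-margin dual in this notation (and the origin of the name "box constraint": each multiplier is boxed into [0, C]) is:

```latex
\max_{a}\; \sum_{n=1}^{N} a_n \;-\; \frac{1}{2}\sum_{n=1}^{N}\sum_{m=1}^{N} a_n a_m\, y_n y_m\, K(x_n, x_m)
\quad \text{s.t.} \quad 0 \le a_n \le C \;\; \forall n, \qquad \sum_{n=1}^{N} a_n y_n = 0
```

With a linear kernel this gives \beta = \sum_n a_n y_n x_n, and \beta_0 = y_s - \beta^\top x_s for any support vector s with 0 < a_s < C.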
Testing
  • Evaluate p(x) for the testing data in order to classify them when considering pairs of classes.
  • Do not forget to normalize all the data by dividing by the max value among the input attributes. Do not normalize the labels.
Report your classification accuracy for different values of C, for both linear and RBF kernels. Consider 5-8 values for C in the interval [0.1, 5]. Report the classification accuracy in answers.pdf.
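The one-vs-one bookkeeping above (45 pairs for 10 classes, majority vote at test time) can be sketched in plain Python. The pairwise classifiers here are toy stand-ins; in the actual assignment each one would be an SVM trained on the rows of the two classes involved:

```python
from itertools import combinations
from collections import Counter

classes = list(range(10))              # USPS digits 0-9
pairs = list(combinations(classes, 2))
print(len(pairs))                      # 45 pairwise SVMs needed

def ovo_predict(x, pairwise_classifiers):
    """Majority vote over all (i, j) pairwise decisions.

    pairwise_classifiers maps (i, j) -> a function f(x) returning i or j.
    Each f stands in for an SVM trained on classes i and j only.
    """
    votes = Counter(f(x) for f in pairwise_classifiers.values())
    return votes.most_common(1)[0][0]

# Toy stand-in: each pairwise "SVM" picks whichever of its two classes
# is numerically closer to the 1-D input x.
clfs = {}
for i, j in pairs:
    clfs[(i, j)] = lambda x, i=i, j=j: i if abs(x - i) <= abs(x - j) else j

print(ovo_predict(3.2, clfs))          # class 3 wins all 9 of its pairs
```

The voting logic is the same once the stand-in functions are replaced by real pairwise SVM decision rules.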
Here are a few details about the training and testing files:
  • USPS handwritten digits: Each row of the training file contains 17 numbers. The first 16 correspond to features extracted from a digital image of a handwritten digit. The last column is the label which indicates what digit the first 16 numbers represent. There are 10 different classes.

