
How can I make a decision stump using a decision tree?

7 views (last 30 days)
MHN
MHN on 4 Feb 2016
Commented: MHN on 4 Feb 2016
Right now, I am using fitctree and pruning the tree to level one, but because the 'SplitCriterion' is 'gdi' and the 'PruneCriterion' is 'error', I cannot find the best decision stump that minimizes the total error. I know there is a 'MaxNumSplits' option for growing a decision tree, but that option was added in 2015 and I am using the 2014 version of MATLAB. So, I need to build a decision stump that minimizes the error with respect to the given weights, and I would prefer to use fitctree instead of writing it from scratch. Any suggestions?
Here is an example that explains my problem more clearly. The correct stump should look like this:
I have a full tree like the following one, and I prune it to level (max-1).
Since it is still not a decision stump, I prune it one more level, but the result is not the same as the correct one.
And this is the code that I am using:
MinLeafSize = 1;
MinParentSize = 2;
NumVariablesToSample = 'all';
ScoreTransform = 'none';
PruneCriterion = 'error';
SplitCriterion = 'gdi';
Weights = Pt; % train the weak learner with the weights Pt
tree = fitctree(X, Y, 'MinLeafSize', MinLeafSize, 'MinParentSize', MinParentSize, ...
    'NumVariablesToSample', NumVariablesToSample, 'PruneCriterion', PruneCriterion, ...
    'SplitCriterion', SplitCriterion, 'Weights', Weights, 'ScoreTransform', ScoreTransform);
prune_tree = prune(tree, 'Level', max(tree.PruneList)-1); % prune the tree down to a decision stump
% if the pruned tree still has more than one decision node (three inner nodes),
% use max(tree.PruneList) to reduce it to a single node
if length(prune_tree.NodeSize) > 3
    prune_tree = prune(tree, 'Level', max(tree.PruneList));
end
P.S. I would like to implement AdaBoostM1 using decision stumps as the weak learner. I know that there is a built-in function 'fitensemble' for making an ensemble classifier with the AdaBoostM1 algorithm, but I would like to implement it myself.
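For context (this sketch is not part of the original question): assuming binary labels Y coded as -1/+1 and the single-split fitctree call from the accepted answer below as the weak learner, the AdaBoost.M1 loop I have in mind looks roughly like this; the number of rounds T is an arbitrary choice.
T = 50;                               % number of boosting rounds (assumed)
N = size(X,1);
Pt = ones(N,1)/N;                     % start with uniform sample weights
alpha = zeros(T,1);
stumps = cell(T,1);
for t = 1:T
    % weighted decision stump (single root split, see the accepted answer below)
    stumps{t} = fitctree(X, Y, 'minparent', N, 'prune', 'off', ...
        'mergeleaves', 'off', 'Weights', Pt);
    pred = predict(stumps{t}, X);
    err = sum(Pt .* (pred ~= Y));             % weighted training error
    if err >= 0.5, break; end                 % weak-learning assumption violated
    alpha(t) = 0.5 * log((1 - err) / max(err, eps));
    Pt = Pt .* exp(-alpha(t) .* Y .* pred);   % up-weight misclassified samples
    Pt = Pt / sum(Pt);                        % renormalize the weights
end
% final prediction for new data Xnew: sign of the weighted vote,
% i.e. sign( sum over t of alpha(t) * predict(stumps{t}, Xnew) )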
  1 Comment
MHN
MHN on 4 Feb 2016
Edited: MHN on 4 Feb 2016
The default options for 'fitensemble' with the 'AdaBoostM1' method grow such a decision stump. Does anyone know how we should manually set the options for fitctree to get the same decision stump? I have checked classTreeEns.Trained, which shows the tree properties, but since it is a compact classification tree, the pruning information and ModelParameters are removed.
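As a hedged, illustrative check (not part of the original comment), one way to see what fitensemble actually grows is to print the first trained weak learner as text and read off its single split:
classTreeEns = fitensemble(X, Y, 'AdaBoostM1', 10, 'Tree');
view(classTreeEns.Trained{1}, 'Mode', 'text')   % shows the root split of the first weak learner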


Accepted Answer

Ilya
Ilya on 4 Feb 2016
Use
fitctree(X,Y,'minparent',size(X,1),'prune','off','mergeleaves','off')
  1 Comment
MHN
MHN on 4 Feb 2016
Thank you very much indeed. I have just added the weights to it as well.
fitctree(X,Y,'minparent',size(X,1),'prune','off','mergeleaves','off', 'Weights', Weights)
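For illustration only (this example is not from the thread): setting 'minparent' to size(X,1) means only the root node, which contains all observations, has enough samples to be split, and turning off pruning and leaf merging keeps that single split intact. A minimal weighted example on Fisher's iris data:
load fisheriris
X = meas(51:150, :);                        % versicolor vs. virginica only
Y = species(51:150);
Weights = ones(size(X,1), 1) / size(X,1);   % uniform weights, just as an example
stump = fitctree(X, Y, 'minparent', size(X,1), 'prune', 'off', ...
    'mergeleaves', 'off', 'Weights', Weights);
view(stump, 'Mode', 'text')                 % one root split, two leaves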


More Answers (0)
