Tuning Parameters for Boosting/Bagging/Random Forest
Hello
I want to use tree-based classifiers for my classification problem. I'm considering bagging, boosting (AdaBoost, LogitBoost, RUSBoost) and Random Forest, but I'm unsure about the tuning parameters, i.e. which ranges I should search.
I'm using the TreeBagger and fitensemble functions in MATLAB. I'm unsure about the following parameters (a sketch of how I currently pass them follows the list):
- Number of iterations / Trees
- Sampling with or without replacement? If without replacement, what in-bag fraction should I take?
- Minimum Leaf Size
- Minimum Parent Size
- Maximum number of decision splits
- Learning rate for shrinkage
- RatioToSmallest (each element of this vector is the sampling proportion for that class with respect to the class with the fewest observations). My classes are highly imbalanced.
- MarginPrecision
- The level of pruning and the pruning cost (alpha) to which the tree should be pruned
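For reference, this is roughly how I pass these options at the moment. The numeric values are only starting points I intend to grid-search, the ionosphere sample data set is just a stand-in for my own data, and some option names differ between releases (e.g. MinLeaf/FBoot instead of MinLeafSize/InBagFraction in older versions):

load ionosphere                                    % stand-in data (X: predictors, Y: labels)

% Random forest / bagging with TreeBagger
rf = TreeBagger(200, X, Y, ...                     % 200 trees
    'Method', 'classification', ...
    'MinLeafSize', 1, ...                          % minimum leaf size
    'InBagFraction', 0.67, ...                     % in-bag fraction per tree
    'SampleWithReplacement', 'on');                % bootstrap vs. subsampling

% Boosting with fitensemble and a tree template
t = templateTree('MinLeafSize', 10, ...            % minimum leaf size
                 'MinParentSize', 20, ...          % minimum parent size
                 'MaxNumSplits', 50);              % maximum number of decision splits
boost = fitensemble(X, Y, 'RUSBoost', 300, t, ...  % 300 boosting iterations
    'LearnRate', 0.1, ...                          % shrinkage / learning rate
    'RatioToSmallest', [1 1]);                     % per-class sampling ratios (imbalanced classes)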
I would be very happy if somebody could offer some quick help.