Tuning Parameters for Boosting/Bagging/Random Forest
Hello
I want to use tree-based classifiers for my classification problem. I'm considering bagging, boosting (AdaBoost, LogitBoost, RUSBoost), and Random Forest, but I'm unsure about the tuning parameters, i.e. which ranges I should search.
I'm using the TreeBagger and fitensemble functions from MATLAB. I'm unsure about the following parameters (a rough sketch of my current, untuned calls follows the list):
- Number of iterations / Trees
- Sampling with or without replacement? If without replacement, what in-bag fraction should I use?
- Minimum Leaf Size
- Minimum Parent Size
- Maximum number of decision splits
- Learning rate for shrinkage
- RatioToSmallest (each element of this vector is the sampling proportion for that class relative to the class with the fewest observations). My classes are highly imbalanced.
- MarginPrecision
- (The pruning level and the pruning cost (alpha) the tree should be pruned to)
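In case it helps, here is roughly how I call the two functions at the moment. The data set is just a placeholder (my real data are much larger and imbalanced), and every parameter value below is an initial guess, nothing is tuned yet:

load fisheriris                                    % placeholder data set
X = meas;                                          % predictor matrix (observations x features)
Y = species;                                       % class labels

% Bagging / random forest via TreeBagger
nTrees = 200;                                      % number of trees
rf = TreeBagger(nTrees, X, Y, ...
    'Method',                'classification', ...
    'SampleWithReplacement', 'on', ...             % bootstrap sampling
    'InBagFraction',         1.0, ...              % fraction of the data drawn per tree
    'MinLeafSize',           1, ...                % minimum leaf size
    'NumPredictorsToSample', round(sqrt(size(X,2))));  % random-forest style feature subsampling

% Boosting via fitensemble (RUSBoost because of the class imbalance)
t = templateTree('MinLeafSize',   5, ...           % minimum leaf size
                 'MinParentSize', 10, ...          % minimum parent size
                 'MaxNumSplits',  20);             % maximum number of decision splits
boosted = fitensemble(X, Y, 'RUSBoost', 300, t, ...
    'LearnRate',       0.1, ...                    % shrinkage / learning rate
    'RatioToSmallest', ones(1, numel(unique(Y)))); % sample every class down to the smallest one

So the question is mainly which ranges for these values are sensible to search over.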
I would be very grateful if somebody could offer some quick guidance.