A Flexible Boosting Algorithm With Adaptive Loss Functions
Arguments

X: Training data (feature matrix).
y: Labels of the training data.
n_rounds: Number of boosting rounds (how many trees to build).
interval: Step size used when varying the parameter of the exponential loss function.
width: Size of the search area for the loss parameter (must be greater than 1).
type: Tie-evaluation option (1 or 2; 2 is recommended).
control: Tree control settings, fixed to cp = -1 and maxdepth = 1 (decision stumps), following AdaBoost.
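To illustrate how interval and width interact, the following sketch (an assumption for illustration, not the package's exported API) varies the parameter k of an exponential loss exp(-k * y * f(x)) over a search area bounded by width, in steps of interval, and keeps the k with the lowest mean training loss. The function names exp_loss and candidate_k are hypothetical.

```r
# Parameterised exponential loss; k = 1 recovers AdaBoost's loss.
exp_loss <- function(k, y, f) exp(-k * y * f)

# Candidate k values: a search area bounded by `width` (> 1),
# stepped through with `interval`. Hypothetical helper for illustration.
candidate_k <- function(width, interval) seq(1 / width, width, by = interval)

y <- c(1, -1, 1)        # labels in {-1, +1}
f <- c(0.8, -0.3, 0.5)  # margin scores from a weak classifier
ks <- candidate_k(width = 2, interval = 0.5)
losses <- sapply(ks, function(k) mean(exp_loss(k, y, f)))
best_k <- ks[which.min(losses)]
```

Here every example is classified correctly (all margins y * f are positive), so the mean loss shrinks as k grows and the largest candidate k is selected.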
Details

This is the main algorithm of FlexBoost; like other boosting packages, it returns compatible information. To prevent unexpected errors, missing values are not allowed in the input data. The return value is composed of four major parts:

terms: input variable information
trees: decision tree information
alphas: weight of each weak classifier
acc: training accuracy at each iteration
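The four components above can be inspected like any R list. The fit object below is a hand-built stand-in with the documented structure (its values are made up for illustration); a real one would come from the package's training function.

```r
# Hypothetical result object mirroring the documented components.
fit <- list(
  terms  = c("x1", "x2"),        # input variable information
  trees  = list(),               # per-round decision stumps
  alphas = c(0.42, 0.31, 0.27),  # weights of the weak classifiers
  acc    = c(0.81, 0.86, 0.90)   # training accuracy per iteration
)

# Training accuracy after the final boosting round:
final_acc <- tail(fit$acc, 1)
```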
Value

Returns the fitted decision tree information (e.g. split criteria, weights of the weak classifiers, and training accuracy per round).