Description

Decision Forest algorithm: model training.
Arguments

X                  Training dataset.
Y                  Training data endpoint (class labels).
stop_step          Number of extra steps processed when performance does not
                   improve; 1 means one extra step.
Max_tree           Maximum number of trees in the forest.
min_split          Minimum number of leaves (samples) required in a tree node
                   for it to be split.
cp                 Complexity parameter used to prune the decision trees;
                   default is 0.1.
Filter             If TRUE, perform feature selection before training.
p_val              p-value threshold for the t-test used in feature selection;
                   default is 0.05.
Method             Metric used to evaluate the training process. "MIS":
                   misclassification rate; "ACC": accuracy.
Quiet              If TRUE (the default), suppress all messages during
                   training.
Grace_val          Grace value used in evaluation: the next model's
                   performance (Accuracy, bACC, MCC) may be worse than the
                   previous model's by at most this threshold.
imp_accu_val       Improvement threshold: each newly added tree should improve
                   the overall model performance (Accuracy, bACC, MCC) by at
                   least this amount.
imp_accu_criteria  If TRUE, the model must show improvement in accumulated
                   accuracy.
Value

A list with the following components:

.$accuracy   Overall training accuracy.
.$pred       Detailed training predictions (fitted values).
.$detail     Detailed record of the decision tree features/models used and
             their performance.
.$models     List of the constructed decision tree models.
.$Method     The evaluation method used in training (passed through).
.$cp         The cp value used in training the decision trees (passed
             through).
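Examples

The usage block did not survive extraction, so the sketch below is a minimal, hedged illustration of how the arguments above fit together. It assumes the documented function is DF_train() from the DecisionForest package, that X is a numeric feature data frame, and that Y is a two-level factor endpoint; the synthetic data and the specific argument values are illustrative only, not defaults from the source.

```r
# Hedged sketch: DF_train(), the data shapes, and the argument values below
# are assumptions for illustration, not taken from this help page.
library(DecisionForest)

set.seed(1)
X <- data.frame(matrix(rnorm(100 * 20), nrow = 100))  # 100 samples, 20 features
Y <- factor(rep(c("pos", "neg"), each = 50))          # binary training endpoint

model <- DF_train(X, Y,
                  stop_step = 2,      # allow 2 extra steps without improvement
                  Max_tree  = 20,     # at most 20 trees in the forest
                  cp        = 0.1,    # pruning complexity parameter (default)
                  Method    = "ACC",  # evaluate training by accuracy
                  Quiet     = TRUE)   # suppress progress messages

model$accuracy   # overall training accuracy
model$models     # list of constructed decision tree models
```

The returned list mirrors the Value section above, so individual components such as .$pred and .$detail can be inspected the same way.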