Description

Decision Forest algorithm: model training. In the Preferred-2 settings, KNN is used instead of the Decision Tree.
Usage

DFp2_train(X, Y, stop_step = 5, Max_tree = 20, Max_feat = 5, k = 5,
  grace_threshold = 0.01, Filter = F, p_val = 0.05, Method = "bACC",
  Quiet = T)
Arguments

X: Training dataset
Y: Training data endpoint (class labels)
stop_step: Number of extra steps to process after performance stops improving; 1 means one extra step
Max_tree: Maximum number of trees in the forest
Max_feat: Maximum number of occurrences of each feature in the forest
Filter: If TRUE, perform feature selection before training
p_val: P-value threshold (t-test) used in feature selection; default is 0.05
Method: Metric used to evaluate the training process: MIS (misclassification rate), ACC (accuracy), bACC (balanced accuracy), or MCC (Matthews correlation coefficient)
Quiet: If TRUE (default), suppress all messages during the process
min_leaf: Minimum number of samples in a leaf node of a tree
cp: Complexity parameter for pruning the decision tree; default is 0.1
Grace_ACC: Grace value in evaluation: the next model's performance (accuracy) may not fall below the previous model's by more than this threshold
imp_ACC_accu: Improvement threshold in evaluation: adding a new tree must improve the overall model performance (accuracy) by at least this amount
Grace_bACC: Grace value in evaluation (balanced accuracy)
imp_bACC_accu: Improvement threshold in evaluation (balanced accuracy)
Grace_MCC: Grace value in evaluation (MCC)
imp_MCC_accu: Improvement threshold in evaluation (MCC)
Grace_MIS: Grace value in evaluation (MIS)
imp_MIS_accu: Improvement threshold in evaluation (MIS)
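The stop_step, grace, and improvement arguments above together describe a greedy forest-growing loop: learners are added one at a time, a new learner is kept only if it improves (or at least does not degrade) the chosen metric, and training stops after stop_step consecutive non-improving steps. A minimal R sketch of that stopping logic, written for illustration only (the helper names `fit_one` and `evaluate` are placeholders, and this simplified loop is not the package's actual implementation):

```r
## Schematic of the stopping rule: keep adding base learners while the
## chosen metric improves; tolerate up to stop_step non-improving steps.
grow_forest <- function(fit_one, evaluate, stop_step = 5, Max_tree = 20,
                        grace_threshold = 0.01) {
  forest <- list()
  best <- -Inf
  bad_steps <- 0
  for (i in seq_len(Max_tree)) {
    candidate <- c(forest, list(fit_one(i)))   # tentatively add learner i
    perf <- evaluate(candidate)
    if (perf > best + grace_threshold) {       # clear improvement: keep it
      forest <- candidate
      best <- perf
      bad_steps <- 0
    } else {                                   # no improvement this step
      bad_steps <- bad_steps + 1
      if (bad_steps >= stop_step) break        # patience exhausted
    }
  }
  forest
}
```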
Value

The trained model together with its training performance, evaluated with KNN.
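Examples

As an illustrative sketch only (the objects `train_features` and `train_labels` are placeholder names for a user-supplied feature data frame and endpoint vector, not data shipped with the package), a typical call might look like:

```r
## Hypothetical usage: train a Preferred-2 Decision Forest with balanced
## accuracy as the evaluation metric and t-test feature selection enabled.
model <- DFp2_train(X = train_features, Y = train_labels,
                    stop_step = 5, Max_tree = 20, Max_feat = 5, k = 5,
                    grace_threshold = 0.01, Filter = TRUE, p_val = 0.05,
                    Method = "bACC", Quiet = FALSE)
```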