xgboostOptimisation    R Documentation
Description

Classification parameter optimisation for the xgboost algorithm.
Usage

xgboostOptimisation(
  object,
  fcol = "markers",
  max_depth = 4:7,
  gamma = seq(2.6, 3.2, 0.3),
  nrounds = 100,
  times = 100,
  test.size = 0.2,
  xval = 5,
  fun = mean,
  seed,
  verbose = TRUE,
  ...
)
Arguments

object
    An instance of class "MSnSet".
fcol
    The feature meta-data containing marker definitions. Default is markers.

max_depth
    The hyper-parameter; the maximum depth of the tree. Default values are 4:7.

gamma
    The hyper-parameter; the minimum loss reduction required to make a further
    partition on a leaf node of the tree. The larger gamma is, the more
    conservative the algorithm will be. Default values are seq(2.6, 3.2, 0.3).

nrounds
    The maximum number of boosting iterations. Default is 100.

times
    The number of times internal cross-validation is performed. Default is 100.

test.size
    The size of the test data. Default is 0.2 (20 percent).
xval
    The n-cross validation. Default is 5.

fun
    The function used to summarise the xval macro F1 results. Default is mean.
seed
    The optional random number generator seed.
verbose
    A logical indicating whether progress output is displayed. Default is TRUE.
...
    Additional parameters passed to the underlying xgboost training function.
Details

Note that when the performance scores precision, recall and (macro) F1 are calculated, any NA values are replaced by 0. This decision is motivated by the fact that any class with an NA precision or recall would result in an NA F1 score and, eventually, an NA macro F1 (i.e. mean(F1)). Replacing NAs by 0s leads to F1 values of 0 and a reduced yet defined final macro F1 score.
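As an illustration of this NA handling, the following base R sketch (using hypothetical per-class scores, not output of this function) shows how replacing an NA precision by 0 still yields a defined, if reduced, macro F1:

## Hypothetical per-class precision and recall; class C was never
## predicted, so its precision is NA.
precision <- c(A = 0.90, B = 0.75, C = NA)
recall    <- c(A = 0.85, B = 0.80, C = 0.10)

## Replace NA scores by 0, as described above.
precision[is.na(precision)] <- 0
recall[is.na(recall)]       <- 0

## Per-class F1, guarding against 0/0 by setting those F1 values to 0.
f1 <- ifelse(precision + recall == 0, 0,
             2 * precision * recall / (precision + recall))

## Macro F1 is the mean of the per-class F1 scores: reduced, but defined.
mean(f1)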
Value

An instance of class "GenRegRes".
See Also

xgboostClassification and example therein.
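A minimal usage sketch, not taken from this page: it assumes the function's host package is attached and that a marker-annotated MSnSet is available (here dunkley2006 from the pRolocdata package); the dataset choice, the seed and the reduced times value are illustrative assumptions.

library("pRolocdata")
data(dunkley2006)   ## assumed example MSnSet with a "markers" feature variable

## Search the max_depth and gamma grids over repeated 80/20 test partitions;
## times is lowered from its default of 100 to keep the run short.
params <- xgboostOptimisation(dunkley2006,
                              fcol = "markers",
                              max_depth = 4:7,
                              gamma = seq(2.6, 3.2, 0.3),
                              times = 10,
                              seed = 1,
                              verbose = FALSE)

## The returned "GenRegRes" object summarises the grid search and can be
## passed on to xgboostClassification.
params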