xgboostOptimisation: xgboost parameter optimisation

Description

Classification parameter optimisation for the xgboost algorithm.

Usage

xgboostOptimisation(
  object,
  fcol = "markers",
  max_depth = 4:7,
  gamma = seq(2.6, 3.2, 0.3),
  nrounds = 100,
  times = 100,
  test.size = 0.2,
  xval = 5,
  fun = mean,
  seed,
  verbose = TRUE,
  ...
)

Arguments

object

An instance of class "MSnSet".

fcol

The feature meta-data containing marker definitions. Default is markers.

max_depth

The hyper-parameter; the maximum depth of a tree. Default values are 4:7.

gamma

The hyper-parameter; the minimum loss reduction required to make a further partition on a leaf node of the tree. The larger gamma is, the more conservative the algorithm will be. Default values are seq(2.6, 3.2, 0.3).

nrounds

The maximum number of boosting iterations. Default is 100.

times

The number of times internal cross-validation is performed. Default is 100.

test.size

The proportion of the data used as the test partition. Default is 0.2 (20 percent).

xval

The number of partitions used for the internal n-fold cross-validation. Default is 5.

fun

The function used to summarise the xval macro F1 matrices. Default is mean.

seed

The optional random number generator seed.

verbose

A logical defining whether a progress bar is displayed.

...

Additional parameters passed to xgb.train from package xgboost.

Details

Note that when the performance scores precision, recall and (macro) F1 are calculated, any NA values are replaced by 0. This decision is motivated by the fact that any class with an NA precision or recall would result in an NA F1 score and, eventually, an NA macro F1 (i.e. mean(F1)). Replacing NAs by 0s leads to F1 values of 0 and a reduced, yet defined, final macro F1 score.
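
For illustration, the following minimal sketch (illustrative only, not the package's internal code) shows the effect of this replacement on per-class precision and recall vectors where one class has an undefined score:

## Illustrative only; not the implementation used inside xgboostOptimisation
precision <- c(0.80, NA, 0.60)   ## per-class precision, one class undefined
recall    <- c(0.70, 0.50, NA)   ## per-class recall, one class undefined

f1 <- 2 * precision * recall / (precision + recall)
f1[is.na(f1)] <- 0               ## replace NA F1 values by 0
mean(f1)                         ## reduced, yet defined, macro F1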

Value

An instance of class "GenRegRes".

See Also

xgboostClassification and example therein.
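
Examples

The following sketch illustrates a typical call, assuming the dunkley2006 MSnSet from the pRolocdata package as example data (any MSnSet with a markers feature column would do). The parameter grid and the number of iterations are kept small for speed, and pRoloc's getParams and plot methods for GenRegRes objects are assumed to be available for inspecting the result.

library("pRolocExtra")
library("pRolocdata")
data(dunkley2006)                ## example data (assumption); any MSnSet
                                 ## with a "markers" feature column works

## Optimise max_depth and gamma over a small grid; the selected
## parameters can then be passed on to xgboostClassification.
params <- xgboostOptimisation(dunkley2006,
                              fcol = "markers",
                              max_depth = 4:5,
                              gamma = seq(2.6, 3.2, 0.3),
                              nrounds = 50,
                              times = 10,
                              xval = 5,
                              seed = 1,
                              verbose = FALSE)

## Inspect the results with pRoloc's GenRegRes methods (assumed available)
getParams(params)                ## best (max_depth, gamma) pair
plot(params)                     ## distribution of macro F1 scores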

