set_tuning: Set tuning strategy


Description

Create an AutoTuner to set the random forest hyperparameter tuning strategy.

Usage

set_tuning(
  in_learner,
  in_measure,
  nfeatures,
  insamp_nfolds,
  insamp_neval,
  insamp_nbatch
)

Arguments

in_learner

Learner whose hyperparameters to tune.

in_measure

Performance measure to use for hyperparameter tuning. The hyperparameter configuration that optimizes this measure is selected and used for final model training.

nfeatures

Total number of predictor variables in the model.

insamp_nfolds

Number of cross-validation folds used for tuning.

insamp_neval

Number of times the cross-validation is repeated (e.g. if 2, twice-repeated CV).

insamp_nbatch

Number of hyperparameter configurations to evaluate at the same time. This dictates how many processing cores are used during hyperparameter tuning: the larger this number, the faster the tuning, but also the more computationally intensive. It should not be set higher than the number of cores on the computer used.

Details

For more information on hyperparameter tuning and a table of the range of hyperparameters and tuning strategies used, see sections IVa and IVb in the Supplementary Information of Messager et al. 2021 at https://www.nature.com/articles/s41586-021-03565-5.

Value

An AutoTuner object.
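
A minimal sketch of how set_tuning might be called, assuming the mlr3 / mlr3tuning ecosystem that AutoTuner belongs to; the learner, measure, and argument values below are illustrative choices, not taken from the package itself:

```r
library(mlr3)
library(mlr3learners)
library(mlr3tuning)

# Random forest learner whose hyperparameters will be tuned
# (ranger backend assumed here for illustration)
lrn_rf <- lrn("classif.ranger", predict_type = "prob")

# Hypothetical call: AUC as the tuning measure, 20 predictors,
# 3-fold CV repeated twice, and 4 configurations evaluated per
# batch (i.e. at most 4 cores used in parallel)
at <- set_tuning(
  in_learner    = lrn_rf,
  in_measure    = msr("classif.auc"),
  nfeatures     = 20,
  insamp_nfolds = 3,
  insamp_neval  = 2,
  insamp_nbatch = 4
)
```

The returned AutoTuner can then be trained on a Task like an ordinary mlr3 Learner; tuning runs internally on the resampling strategy configured above before the final model is fit.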


NaiaraLopezRojo/globalIRmap documentation built on Dec. 17, 2021, 5:19 a.m.