Description

Create an AutoTuner to set the random forest hyperparameter tuning strategy.

Usage

set_tuning(
in_learner,
in_measure,
nfeatures,
insamp_nfolds,
insamp_neval,
insamp_nbatch
)
Arguments

in_learner
Learner whose hyperparameters to tune.

in_measure
Performance measure to use for hyperparameter tuning. The hyperparameter configuration that optimizes this measure is selected and used for final model training.

nfeatures
Total number of predictor variables in the model.

insamp_nfolds
Number of cross-validation folds used for tuning.

insamp_neval
Number of times the cross-validation is repeated (e.g. if 2, twice-repeated CV).

insamp_nbatch
Number of hyperparameter configurations to evaluate at the same time. This dictates how many processing cores are used during tuning: the larger the value, the faster the tuning, but the more computing-intensive it is. It should not be set higher than the number of cores available on the computer used.

Details

For more information on hyperparameter tuning, including a table of the hyperparameter ranges and tuning strategies used, see sections IVa and IVb in the Supplementary Information of Messager et al. (2021) at https://www.nature.com/articles/s41586-021-03565-5.

Value

An AutoTuner.
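The call below is a minimal usage sketch rather than an example taken from the package itself: it assumes the mlr3, mlr3learners, and mlr3tuning packages are installed and that set_tuning() accepts an mlr3 Learner and Measure as documented above; the learner ("classif.ranger"), the measure ("classif.auc"), and all numeric settings are illustrative.

# Minimal sketch (illustrative values; assumes the mlr3 ecosystem is installed)
library(mlr3)
library(mlr3learners)

# Random forest learner and the measure to optimize during tuning
rf_learner  <- lrn("classif.ranger", predict_type = "prob")
auc_measure <- msr("classif.auc")

# Twice-repeated 5-fold inner CV, evaluating 4 configurations in parallel
rf_autotuner <- set_tuning(
  in_learner    = rf_learner,
  in_measure    = auc_measure,
  nfeatures     = 20,   # number of predictor variables in the data
  insamp_nfolds = 5,
  insamp_neval  = 2,
  insamp_nbatch = 4
)

# The returned AutoTuner behaves like an mlr3 learner,
# e.g. rf_autotuner$train(task)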