tidytune performs hyperparameter tuning by leveraging the rsample and recipes packages. Adopting a tidy approach improves readability and reproducibility (I had to google that word), and allows seamless use of other tidy tools such as dplyr.
Currently, the following methods are implemented:

- grid search
- random search
- model-based optimization
Here's, in a nutshell, how you would perform a grid search in tidytune. In the example below, xgboost_classif_score is a custom scoring function provided by the user (a sketch of what such a function might look like follows the call).
library(tidytune)

# resamples: an rsample resampling object, e.g. rsample::vfold_cv(train_df, v = 5)
# rec: a recipes::recipe describing the preprocessing steps

xgboost_param_grid <- expand.grid(eta = c(0.1, 0.05), max_depth = c(3, 4))

grid_search(
  resamples = resamples,
  recipe = rec,
  param_grid = xgboost_param_grid,
  score_func = xgboost_classif_score,
  nrounds = 100,  # additional arguments are passed on to the scoring function
  verbose = FALSE
)
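Since the signature tidytune expects for score_func isn't spelled out above, here is a rough, hypothetical sketch of what such a scoring function could look like: train an xgboost classifier on the analysis data and return a metric computed on the assessment data. All argument names below (train_df, target_var, params, eval_df) are assumptions for illustration, not tidytune's documented interface.

# Hypothetical sketch only: the argument names are assumptions,
# not tidytune's documented interface.
xgboost_classif_score <- function(train_df, target_var, params, eval_df, nrounds, ...) {
  library(xgboost)
  features <- setdiff(names(train_df), target_var)
  dtrain <- xgb.DMatrix(as.matrix(train_df[, features]), label = train_df[[target_var]])

  fit <- xgboost(
    data = dtrain,
    params = c(params, list(objective = "binary:logistic")),
    nrounds = nrounds,
    verbose = 0
  )

  preds <- predict(fit, as.matrix(eval_df[, features]))
  # Return a single number: classification accuracy at a 0.5 cutoff,
  # assuming a 0/1 target
  mean((preds > 0.5) == eval_df[[target_var]])
}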
For grid and random search, there's also a batch version, in case you need to save or inspect your results while the search is progressing (especially useful with big models that take a long time to tune).
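The batch API itself isn't shown in this snippet; purely to illustrate the idea, the sketch below emulates batching by splitting the parameter grid into chunks with plain R and checkpointing to disk after each chunk. Only grid_search comes from tidytune; the chunking scheme and file name are made up for the example.

# Illustration only: manual batching with checkpoints, not tidytune's batch API.
batches <- split(xgboost_param_grid, rep(1:2, length.out = nrow(xgboost_param_grid)))

results <- list()
for (i in seq_along(batches)) {
  results[[i]] <- grid_search(
    resamples = resamples,
    recipe = rec,
    param_grid = batches[[i]],
    score_func = xgboost_classif_score,
    nrounds = 100,
    verbose = FALSE
  )
  # Checkpoint: partial results survive an interruption
  saveRDS(results, "grid_search_partial.rds")
}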
Currently, model-based optimization uses a random forest surrogate model under the hood to map parameter values to predicted model performance.
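To make the surrogate idea concrete, here is a minimal, generic sketch (not tidytune's internals) of how a random forest can map already-evaluated parameter settings to predicted performance, so unevaluated candidates can be ranked before spending compute on them. The numbers are toy values for illustration.

library(randomForest)

# Toy history of parameter settings already evaluated, with observed scores
observed <- data.frame(
  eta       = c(0.10, 0.05, 0.10, 0.05),
  max_depth = c(3, 3, 4, 4),
  score     = c(0.81, 0.83, 0.80, 0.84)
)

# Surrogate model: parameter values -> predicted performance
surrogate <- randomForest(score ~ eta + max_depth, data = observed)

# Rank fresh candidates by predicted score; the best ones get evaluated for real
candidates <- expand.grid(eta = c(0.01, 0.2), max_depth = c(2, 5))
candidates$predicted <- predict(surrogate, candidates)
candidates[order(-candidates$predicted), ]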
To install the development version from GitHub:

devtools::install_github('artichaud1/tidytune')