View source: R/setup_hyperparameters.R
get_hyperparameter_defaults — R Documentation
Get hyperparameter values
Usage:

get_hyperparameter_defaults(
  models = get_supported_models(),
  n = 100,
  k = 10,
  model_class = "classification"
)

get_random_hyperparameters(
  models = get_supported_models(),
  n = 100,
  k = 10,
  tune_depth = 5,
  model_class = "classification"
)
Arguments:

models: Which algorithms? Defaults to get_supported_models().

n: Number of observations.

k: Number of features.

model_class: "classification" or "regression".

tune_depth: How many combinations of hyperparameter values to generate (get_random_hyperparameters only).
Get hyperparameters for model training.

get_hyperparameter_defaults returns a list of 1-row data frames (except for glm, which gets a 10-row data frame) with the default hyperparameter values used by flash_models.

get_random_hyperparameters returns a list of data frames containing combinations of random hyperparameter values to tune over in tune_models; the number of rows in each data frame is given by tune_depth.
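A sketch of how the two functions differ (assuming the healthcareai package is installed and that "rf" is among the models returned by get_supported_models()):

```r
library(healthcareai)

# Fixed defaults, as used by flash_models(): one row per algorithm
defaults <- get_hyperparameter_defaults(models = "rf", n = 100, k = 10,
                                        model_class = "classification")
defaults$rf   # a 1-row data frame of hyperparameter values

# Random search candidates, as used by tune_models(): tune_depth rows
grid <- get_random_hyperparameters(models = "rf", n = 100, k = 10,
                                   tune_depth = 5,
                                   model_class = "classification")
nrow(grid$rf) # 5, per tune_depth
```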
For get_hyperparameter_defaults, XGBoost defaults come from caret and the XGBoost documentation: eta = 0.3, gamma = 0, max_depth = 6, subsample = 0.7, colsample_bytree = 0.8, min_child_weight = 1, and nrounds = 50. Random forest defaults come from Introduction to Statistical Learning and caret: mtry = sqrt(k), splitrule = "extratrees", and min.node.size = 1 for classification or 5 for regression. glm defaults come from caret: alpha = 1, and because glmnet fits a sequence of lambda values nearly as fast as an individual value, lambda is a sequence from 1e-4 to 8.
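The glm case illustrates why its default is a 10-row data frame rather than a single row (a sketch; exact values depend on the installed healthcareai version):

```r
library(healthcareai)

glm_defaults <- get_hyperparameter_defaults(models = "glm")$glm
# One row per value in the lambda sequence; alpha is fixed at 1
nrow(glm_defaults)          # 10
unique(glm_defaults$alpha)  # 1
range(glm_defaults$lambda)  # roughly 1e-4 to 8
```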
Value:

A named list of data frames. Each data frame corresponds to an algorithm, and each column in each data frame corresponds to a hyperparameter for that algorithm. This is the same format that should be provided to tune_models(hyperparameters = ) to specify hyperparameter values.
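Because the return value is already in the format tune_models expects, one way to customize a search space is to start from these helpers and edit the result (a sketch, assuming the pima_diabetes dataset bundled with healthcareai):

```r
library(healthcareai)

# Start from random candidates sized to the training data...
hp <- get_random_hyperparameters(models = "rf",
                                 n = nrow(pima_diabetes),
                                 k = ncol(pima_diabetes) - 1,
                                 tune_depth = 10)

# ...then pin one hyperparameter by hand; the other columns still vary
hp$rf$min.node.size <- 1

m <- tune_models(d = pima_diabetes, outcome = diabetes,
                 models = "rf", hyperparameters = hp)
```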
See Also:

models for model and hyperparameter details.