train_models_bayesopt    R Documentation

View source: R/train_models_bayesopt.R

Description:
Bayesian Optimization takes a relatively long time: the larger the 'iters.n' parameter, the longer it runs. If you want to obtain model parameters better than the defaults, it is suggested to set 'iters.n' to at least 20. The larger the dataset, the longer Bayesian Optimization takes as well.

Usage:
train_models_bayesopt(
train_data,
y,
time,
status,
test_data,
engine,
type,
parallel = FALSE,
iters.n = 7,
bayes_info = list(verbose = 0, plotProgress = FALSE),
return_params = FALSE,
verbose = TRUE
)
Arguments:

train_data
    Training data for the models, created by the 'prepare_data()' function.

y
    A string that indicates the target column name for regression or classification. Either 'y' or the pair 'time', 'status' can be used.

time
    A string that indicates the time column name for the survival analysis task. Either 'y' or the pair 'time', 'status' can be used.

status
    A string that indicates the status column name for the survival analysis task. Either 'y' or the pair 'time', 'status' can be used.

test_data
    Test data for the models, created by the 'prepare_data()' function.

engine
    A vector of tree-based models that shall be created. Possible values are: 'ranger', 'xgboost', 'decision_tree', 'lightgbm', 'catboost'. It does not matter for survival analysis.

type
    A string that determines whether the machine learning task is 'binary_clf', 'regression', or 'survival'.

parallel
    A logical value; if set to TRUE, the function uses parallel computing. By default set to FALSE.

iters.n
    The number of iterations of the BayesOpt function.

bayes_info
    A list with two values determining the verbosity of the Bayesian Optimization process. The first value is 'verbose', with 3 levels: 0 - no output; 1 - describes what is happening and whether a local optimum can be reached; 2 - additionally provides information about the recent and the best scores. The second value is 'plotProgress', a logical value indicating whether the progress of the Bayesian Optimization should be plotted. WARNING: it creates a plot after each step, which might be computationally expensive. Both arguments come from the 'ParBayesianOptimization' package. They only matter if the global 'verbose' is set to TRUE. Default: list(verbose = 0, plotProgress = FALSE). Example values are shown in the sketch after this argument list.

return_params
    A logical value; if set to TRUE, the optimized model parameters are also returned.

verbose
    A logical value; if set to TRUE, provides all information about the process; if FALSE, gives none.
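As a hedged illustration of the 'time'/'status' pair and the 'bayes_info' argument described above, the sketch below shows a survival-analysis call. The objects 'prepared_train' and 'prepared_test' and the column names 'surv_time' and 'surv_status' are placeholders, not names defined by the package.

# Placeholders: 'prepared_train' and 'prepared_test' stand for data created by
# prepare_data(); 'surv_time' and 'surv_status' are hypothetical column names.
survival_models <- train_models_bayesopt(
  train_data = prepared_train,
  test_data  = prepared_test,
  time       = 'surv_time',    # used instead of 'y' for survival analysis
  status     = 'surv_status',
  engine     = c('ranger'),    # passed for completeness; the documentation notes
                               # it does not matter for survival analysis
  type       = 'survival',
  iters.n    = 20,
  # Report what is happening at each optimization step; this only takes effect
  # when the global 'verbose' is TRUE.
  bayes_info = list(verbose = 1, plotProgress = FALSE),
  verbose    = TRUE
)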
Value:

Trained models with optimized parameters. If 'return_params' is TRUE, the training parameters are also returned in one list together with the models.
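A minimal end-to-end sketch for a binary classification task is given below. It is not taken from the package examples: the data frame 'my_data', the target column name 'target', and the assumption that 'prepare_data()' accepts the target name and returns a list with 'train' and 'test' elements are all illustrative; adjust them to the actual interface.

## Not run:
# 'my_data' is a placeholder data frame containing a binary target column
# named 'target'; the prepare_data() call below assumes it accepts the data
# and the target column name.
prepared <- prepare_data(my_data, y = 'target')

results <- train_models_bayesopt(
  train_data    = prepared$train,   # assumed element names of the prepare_data() output
  test_data     = prepared$test,
  y             = 'target',
  engine        = c('ranger', 'xgboost', 'lightgbm'),
  type          = 'binary_clf',
  iters.n       = 20,               # at least 20 iterations is suggested to beat the defaults
  return_params = TRUE,             # also return the optimized parameters
  verbose       = TRUE
)
# With return_params = TRUE, 'results' is one list holding both the trained
# models and their optimized training parameters.
## End(Not run)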