LearnerLightgbm | R Documentation
The LearnerLightgbm class is the interface to the lightgbm R package for use with the mlexperiments package.
Optimization metric: needs to be specified with the learner parameter metric. The following options can be set via options() (see the example after this list):

"mlexperiments.optim.lgb.nrounds" (default: 5000L)
"mlexperiments.optim.lgb.early_stopping_rounds" (default: 500L)
"mlexperiments.lgb.print_every_n" (default: 50L)
"mlexperiments.lgb.verbose" (default: -1L)
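For example, these defaults can be adjusted before running an experiment (a minimal sketch; the option names are those listed above, the values are arbitrary):

options(
  "mlexperiments.optim.lgb.nrounds" = 1000L,
  "mlexperiments.optim.lgb.early_stopping_rounds" = 100L,
  "mlexperiments.lgb.print_every_n" = 10L,
  "mlexperiments.lgb.verbose" = 1L
)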
LearnerLightgbm can be used with:

mlexperiments::MLTuneParameters
mlexperiments::MLCrossValidation
mlexperiments::MLNestedCV

Super class: mlexperiments::MLLearnerBase -> LearnerLightgbm
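A tuning experiment can be set up analogously to the cross-validation example below. The following is a minimal sketch, assuming the grid-search interface of mlexperiments::MLTuneParameters (the parameter_grid, learner_args, and split_type fields) and the objects train_x, train_y, and param_list_lightgbm defined in the examples section:

tuner <- mlexperiments::MLTuneParameters$new(
  learner = mllrnrs::LearnerLightgbm$new(
    metric_optimization_higher_better = FALSE
  ),
  strategy = "grid",
  ncores = 2,
  seed = 123
)
tuner$parameter_grid <- param_list_lightgbm
tuner$learner_args <- list(objective = "binary", metric = "binary_logloss")
tuner$split_type <- "stratified"
tuner$set_data(x = train_x, y = train_y)
tuner_results <- tuner$execute(k = 3)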
Method new()

Create a new LearnerLightgbm object.

Usage:
LearnerLightgbm$new(metric_optimization_higher_better)

Arguments:
metric_optimization_higher_better: A logical. Defines the direction of the optimization metric used throughout the hyperparameter optimization.

Returns:
A new LearnerLightgbm R6 object.

Examples:
LearnerLightgbm$new(metric_optimization_higher_better = FALSE)
Method clone()

The objects of this class are cloneable with this method.

Usage:
LearnerLightgbm$clone(deep = FALSE)

Arguments:
deep: Whether to make a deep clone.
See also: lightgbm::lgb.train(), lightgbm::lgb.cv()

Examples:
# binary classification
library(mlbench)
data("PimaIndiansDiabetes2")
dataset <- PimaIndiansDiabetes2 |>
  data.table::as.data.table() |>
  na.omit()
seed <- 123
feature_cols <- colnames(dataset)[1:8]
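# define a hyperparameter grid for lightgbm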
param_list_lightgbm <- expand.grid(
  bagging_fraction = seq(0.6, 1, .2),
  feature_fraction = seq(0.6, 1, .2),
  min_data_in_leaf = seq(10, 50, 10),
  learning_rate = seq(0.1, 0.2, 0.1),
  num_leaves = seq(10, 50, 10),
  max_depth = -1L
)
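# build the numeric feature matrix and the 0/1 integer target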
train_x <- model.matrix(
  ~ -1 + .,
  dataset[, .SD, .SDcols = feature_cols]
)
train_y <- as.integer(dataset[, get("diabetes")]) - 1L
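# create stratified folds for 3-fold cross-validation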
fold_list <- splitTools::create_folds(
  y = train_y,
  k = 3,
  type = "stratified",
  seed = seed
)
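# set up the cross-validation experiment with the LightGBM learner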
lightgbm_cv <- mlexperiments::MLCrossValidation$new(
  learner = mllrnrs::LearnerLightgbm$new(
    metric_optimization_higher_better = FALSE
  ),
  fold_list = fold_list,
  ncores = 2,
  seed = 123
)
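# pass one parameter set from the grid plus fixed lightgbm arguments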
lightgbm_cv$learner_args <- c(
  as.list(
    data.table::data.table(
      param_list_lightgbm[37, ],
      stringsAsFactors = FALSE
    )
  ),
  list(
    objective = "binary",
    metric = "binary_logloss"
  ),
  nrounds = 45L
)
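# evaluate the out-of-fold predictions with AUC, treating class "1" as positive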
lightgbm_cv$performance_metric_args <- list(positive = "1")
lightgbm_cv$performance_metric <- mlexperiments::metric("auc")
# set data
lightgbm_cv$set_data(
  x = train_x,
  y = train_y
)
lightgbm_cv$execute()
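# The call above also returns the fold-wise performance. To keep it for
# later inspection, assign the result of execute() instead (assumption:
# a data.table of per-fold metrics is returned, as in the mlexperiments
# vignettes):
# validation_results <- lightgbm_cv$execute()
# head(validation_results)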
## ------------------------------------------------
## Method `LearnerLightgbm$new`
## ------------------------------------------------
LearnerLightgbm$new(metric_optimization_higher_better = FALSE)