fsi_async (R Documentation)
Description

Function to construct a FSelectInstanceAsyncSingleCrit or FSelectInstanceAsyncMultiCrit.

Usage
fsi_async(
  task,
  learner,
  resampling,
  measures = NULL,
  terminator,
  store_benchmark_result = TRUE,
  store_models = FALSE,
  check_values = FALSE,
  callbacks = NULL,
  ties_method = "least_features",
  rush = NULL
)
Arguments

task
  (mlr3::Task)
learner
  (mlr3::Learner)
resampling
  (mlr3::Resampling)
measures
  (mlr3::Measure or list of mlr3::Measure)
  If NULL, the default measure for the task type is used (see Default Measures below).
terminator
  (bbotk::Terminator)
store_benchmark_result
  (logical(1))
store_models
  (logical(1))
check_values
  (logical(1))
callbacks
  (list of CallbackBatchFSelect)
ties_method
  (character(1))
rush
  (Rush)
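As the description states, the measures argument decides which class is constructed. The following sketch illustrates this; it assumes mlr3 and mlr3fselect are attached and that a rush backend (a reachable Redis server) is available, since asynchronous instances are backed by rush.

library(mlr3)
library(mlr3fselect)

# A single measure yields a FSelectInstanceAsyncSingleCrit (sketch; requires a rush backend)
single = fsi_async(
  task = tsk("penguins"),
  learner = lrn("classif.rpart"),
  resampling = rsmp("cv", folds = 3),
  measures = msr("classif.ce"),
  terminator = trm("evals", n_evals = 10)
)
class(single)

# A list of measures yields a FSelectInstanceAsyncMultiCrit
multi = fsi_async(
  task = tsk("penguins"),
  learner = lrn("classif.rpart"),
  resampling = rsmp("cv", folds = 3),
  measures = msrs(c("classif.ce", "classif.acc")),
  terminator = trm("evals", n_evals = 10)
)
class(multi)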
Resources

There are several sections about feature selection in the mlr3book.
Getting started with wrapper feature selection.
Do a sequential forward selection on the Palmer Penguins data set.
The gallery features a collection of case studies and demos about optimization.
Utilize the built-in feature importance of models with Recursive Feature Elimination.
Run a feature selection with Shadow Variable Search.
Default Measures

If no measure is passed, the default measure is used. The default measure depends on the task type.

Task Type    | Default Measure  | Package
"classif"    | "classif.ce"     | mlr3
"regr"       | "regr.mse"       | mlr3
"surv"       | "surv.cindex"    | mlr3proba
"dens"       | "dens.logloss"   | mlr3proba
"classif_st" | "classif.ce"     | mlr3spatial
"regr_st"    | "regr.mse"       | mlr3spatial
"clust"      | "clust.dunn"     | mlr3cluster
Examples

# Feature selection on the Palmer Penguins data set
library(mlr3)
library(mlr3fselect)

task = tsk("penguins")
learner = lrn("classif.rpart")
# Construct feature selection instance
instance = fsi(
  task = task,
  learner = learner,
  resampling = rsmp("cv", folds = 3),
  measures = msr("classif.ce"),
  terminator = trm("evals", n_evals = 4)
)
# Choose optimization algorithm
fselector = fs("random_search", batch_size = 2)
# Run feature selection
fselector$optimize(instance)
# Subset task to optimal feature set
task$select(instance$result_feature_set)
# Train the learner with optimal feature set on the full data set
learner$train(task)
# Inspect all evaluated sets
as.data.table(instance$archive)
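An asynchronous counterpart of the example above could look like the sketch below. It relies on assumptions beyond this page: the "async_random_search" fselector key and the rush_plan() worker setup depend on the installed versions of mlr3fselect and rush, and rush needs a reachable Redis server.

library(mlr3)
library(mlr3fselect)
library(rush)

# Start local workers for asynchronous evaluation (assumes a reachable Redis server)
rush_plan(n_workers = 2)

instance = fsi_async(
  task = tsk("penguins"),
  learner = lrn("classif.rpart"),
  resampling = rsmp("cv", folds = 3),
  measures = msr("classif.ce"),
  terminator = trm("evals", n_evals = 4)
)

# "async_random_search" is assumed to be the key of the asynchronous random search
fselector = fs("async_random_search")
fselector$optimize(instance)

# Inspect all evaluated feature sets
as.data.table(instance$archive)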