flex_resample_metrics: Extracts Classification Metrics during CV in Caret and Creates a Table for Publication

View source: R/flex_caret_resample_metrics.R


Extracts Classification Metrics during CV in Caret and Creates a Table for Publication

Description

A convenience function that extracts the desired classification metrics obtained during training (resampling, cross-validation) with caret, summarizes them (by default min, mean, and max; for more options see describe), and creates a flextable object. The flextable is then formatted for publication with the format_flextable function.

Usage

flex_resample_metrics(
  ls,
  nod = 3,
  metrics = c("Accuracy", "Mean_Balanced_Accuracy", "Kappa", "logLoss",
    "Mean_Sensitivity", "Mean_Specificity"),
  descriptives = c("min", "mean", "max"),
  ...
)

Arguments

ls

A named list with the algorithm name as the element name and the resamples extracted from caret as the value, e.g., models <- list("Decision Tree" = decision_tree$resample, "KNN" = knn$resample)

nod

The number of decimals to show for each classification metric

metrics

Metrics to be extracted from the resamples of the trained caret model. Note that for full flexibility in which metrics can be evaluated, you should use the summaryFunction caret::multiClassSummary in the caret::trainControl function. Defaults to a selection from multiClassSummary:
metrics = c("Accuracy", "Mean_Balanced_Accuracy", "Kappa", "logLoss", "Mean_Sensitivity", "Mean_Specificity").

descriptives

Summary statistics to be calculated from the resamples obtained in k-fold cross-validation training of a caret machine learning model. The summary statistics are computed with the describe function. By default the function extracts: descriptives = c("min", "mean", "max"). Other alternatives can be found in the documentation of describe and comprise, e.g., median, skew, kurtosis, and se.
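To illustrate what the descriptives argument computes, the following sketch summarizes two metric columns by hand. It assumes that describe refers to psych::describe, and the resample values themselves are made up for illustration only:

```r
# Illustrative sketch only: the resample values below are randomly generated,
# not real CV results. Assumes `describe` refers to psych::describe().
set.seed(1)
resamples <- data.frame(
  Accuracy = runif(30, 0.90, 1.00),  # 10-fold CV repeated 3 times -> 30 rows
  Kappa    = runif(30, 0.85, 1.00)
)
# psych::describe() also offers median, skew, kurtosis, se, among others
psych::describe(resamples)[, c("min", "mean", "max")]
```

flex_resample_metrics applies this kind of summary to each metric column and arranges the results in the publication table.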

...

(Optional) Additional arguments to be passed to flextable.

Value

A flextable object with an APA-ready table that displays the performance metrics obtained during training with cross-validation

Author(s)

Bjoern Buedenbender

See Also

format_flextable, flextable, describe

Examples

## Not run: 
# Create example classifiers on the iris dataset
set.seed(7)
data(iris)
# Settings for the cross-validation
# (classProbs = TRUE is required for probability-based metrics such as logLoss)
control <- caret::trainControl(method = "repeatedcv", number = 10, repeats = 3,
                               classProbs = TRUE,
                               summaryFunction = caret::multiClassSummary)
# Train a decision tree
suppressWarnings(
  decision_tree <- caret::train(Species ~ ., data = iris, method = "rpart",
                                trControl = control, tuneLength = 5)
)
# Train k-nearest neighbors
knn <- caret::train(Species ~ ., data = iris, method = "knn",
                    trControl = control, tuneLength = 5)

# Create a named list of the resamples
models <- list("Decision Tree" = decision_tree$resample,
               "KNN" = knn$resample)

# Create a table with the performance metrics obtained during training
flex_resample_metrics(models)

## End(Not run)


Buedenbender/datscience documentation built on Nov. 21, 2022, 11:14 a.m.