calcPerfMC: Calculate classifier performance metrics (multi-class)


View source: R/calcPerf.R

Description

Calculate classifier performance metrics (multi-class)

Usage

calcPerfMC(confusion, metrics, avg.method = NULL, melt = F)

Arguments

confusion

A matrix or data frame, or a list of matrices or data frames. These should contain a confusion matrix for a single cutoff, or confusion matrices for multiple cutoffs.

metrics

A character vector of the desired performance metrics. Multiple metrics can be specified; however, classifier performance plots usually use only two.

avg.method

'macro': Simple average. Performance metrics are calculated individually for each class and then averaged. 'weighted': Weighted average. As with 'macro', except that the performance metrics for each class are weighted by that class's relative share of the samples (i.e. classes with more samples contribute more to the average).
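The two averaging schemes can be illustrated with base R. This is a sketch of the arithmetic only, not the package internals; the convention that rows are predicted classes and columns are true classes is an assumption for the example.

```r
## Example 3-class confusion matrix (rows = predicted, cols = true; assumed convention)
confusion <- matrix(
  c(50,  2,  3,
     5, 40,  5,
     5,  8, 82),
  nrow = 3, byrow = TRUE,
  dimnames = list(pred = c("A", "B", "C"), true = c("A", "B", "C"))
)

## Per-class TPR (recall): correct predictions / total samples of that class
tpr <- diag(confusion) / colSums(confusion)

## 'macro': simple mean over classes
macro <- mean(tpr)

## 'weighted': mean weighted by each class's share of the samples
w <- colSums(confusion) / sum(confusion)
weighted <- sum(w * tpr)
```

With these counts, `macro` is about 0.848, while `weighted` is 0.86 (for TPR, the weighted average coincides with overall accuracy: 172 correct out of 200).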

melt

If melt = FALSE (the default), output is in wide format. If melt = TRUE, output is in long format.

Value

A numeric vector or matrix of the selected performance metrics

Examples

calcPerfMC(confusion, c('tpr','tnr'), avg.method = 'macro')
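A fuller sketch of a call, assuming calcPerfMC accepts a single confusion matrix and the metric names 'tpr' and 'tnr' shown above (the matrix layout and metric names are illustrative assumptions):

```r
## Hypothetical 3-class confusion matrix (rows = predicted, cols = true; assumed)
confusion <- matrix(c(50,  2,  3,
                       5, 40,  5,
                       5,  8, 82),
                    nrow = 3, byrow = TRUE,
                    dimnames = list(c("A", "B", "C"), c("A", "B", "C")))

## Macro-averaged metrics in wide format
calcPerfMC(confusion, c('tpr', 'tnr'), avg.method = 'macro')

## Same metrics in long format, e.g. for plotting
calcPerfMC(confusion, c('tpr', 'tnr'), avg.method = 'macro', melt = TRUE)
```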

luannnguyen/mltoolkit documentation built on Aug. 29, 2020, 8:31 a.m.