calcStats: Calculate Model Performance

Description Usage Arguments Details Value Methods (by class)

Description

calcStats calculates the performance of a deployed model.

Usage

calcStats(object, aucSkip = FALSE, plotSkip = FALSE, verbose = TRUE)

## S4 method for signature 'ExprsPredict'
calcStats(object, aucSkip = FALSE,
  plotSkip = FALSE, verbose = TRUE)

## S4 method for signature 'RegrsPredict'
calcStats(object, aucSkip = FALSE,
  plotSkip = FALSE, verbose = TRUE)

Arguments

object

An ExprsPredict or RegrsPredict object.

aucSkip

A logical scalar. Toggles whether to calculate the area under the receiver operating characteristic (ROC) curve. See Details.

plotSkip

A logical scalar. Toggles whether to plot the receiver operating characteristic (ROC) curve. See Details.

verbose

A logical scalar. Toggles whether to print the results of model performance to the console.

Details

For classification, calcStats will calculate classifier performance using the area under the receiver operating characteristic (ROC) curve, computed via the ROCR package, provided that all of the following hold: the argument aucSkip = FALSE, the underlying ExprsArray object was an ExprsBinary object with at least one case and one control, and the ExprsPredict object contains a coherent @probability slot. Otherwise, calcStats will calculate classifier performance traditionally, using a confusion matrix. Note that accuracies calculated using ROCR may differ from those calculated using a confusion matrix because ROCR adjusts the discrimination threshold to optimize sensitivity and specificity. This threshold is chosen automatically as the point along the ROC curve that minimizes the Euclidean distance from (0, 1).
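The threshold rule described above can be sketched with the ROCR package directly. This is an illustrative example, not exprso source code: the scores and labels are made up, and only the prediction and performance functions are assumed from ROCR.

```r
# Sketch: pick the cutoff whose ROC point (FPR, TPR) lies closest
# to the top-left corner (0, 1), i.e. minimal Euclidean distance.
library(ROCR)

set.seed(1)
scores <- c(rnorm(50, 1), rnorm(50, -1))          # hypothetical @probability values
labels <- rep(c("Case", "Control"), each = 50)    # hypothetical binary outcome

pred <- prediction(scores, labels, label.ordering = c("Control", "Case"))
perf <- performance(pred, "tpr", "fpr")

fpr <- perf@x.values[[1]]
tpr <- perf@y.values[[1]]
cut <- perf@alpha.values[[1]]

# Euclidean distance from each ROC point to (0, 1)
d <- sqrt(fpr^2 + (1 - tpr)^2)
best <- which.min(d)
cut[best]   # the discrimination threshold used to tabulate accuracy
```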

For regression, accuracy is defined as the R-squared of the fitted regression, which ranges from 0 to 1 for use with pl and pipe. Note that the aucSkip and plotSkip arguments are ignored for regression.
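The following is a sketch of typical usage, assuming the standard exprso workflow (exprso, splitSample, and buildSVM are package functions; the iris subset is an arbitrary example dataset, not one prescribed by this help page):

```r
library(exprso)

# Build an ExprsBinary object from two iris species (arbitrary example data)
array <- exprso(iris[1:100, 1:4], iris[1:100, 5])

# Split into training and test sets, train a model, then deploy it
arrays <- splitSample(array, percent.include = 67)
mach <- buildSVM(arrays[[1]])
pred <- predict(mach, arrays[[2]])

# Tabulate performance; skip the ROC plot but keep the AUC calculation
stats <- calcStats(pred, aucSkip = FALSE, plotSkip = TRUE)
```

Because calcStats returns a data.frame of performance metrics, the result can be inspected or bound across multiple deployments for comparison.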

Value

Returns a data.frame of performance metrics.

Methods (by class)

ExprsPredict: Method to calcStats for ExprsPredict objects (classification).

RegrsPredict: Method to calcStats for RegrsPredict objects (regression).

exprso documentation built on May 1, 2019, 7:11 p.m.