Description
calcStats calculates the performance of a deployed model.
Arguments

object: An ExprsPredict or RegrsPredict object.

aucSkip: A logical scalar. Toggles whether to calculate the area under the receiver operating characteristic curve. See Details.

plotSkip: A logical scalar. Toggles whether to plot the receiver operating characteristic curve. See Details.

verbose: A logical scalar. Toggles whether to print the results of model performance to the console.
Details

For classification, if the argument aucSkip = FALSE AND the ExprsArray object was an ExprsBinary object with at least one case and one control AND the ExprsPredict object contains a coherent @probability slot, calcStats will calculate classifier performance using the area under the receiver operating characteristic (ROC) curve via the ROCR package. Otherwise, calcStats will calculate classifier performance traditionally using a confusion matrix. Note that accuracies calculated using ROCR may differ from those calculated using a confusion matrix because ROCR adjusts the discrimination threshold to optimize sensitivity and specificity. This threshold is automatically chosen as the point along the ROC curve which minimizes the Euclidean distance from (0, 1), as illustrated in the sketch below.
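Below is an illustrative sketch (not the package's internal code) of ROC-based evaluation with the ROCR package, including the choice of the cutoff whose ROC point lies closest to the (0, 1) corner; the labels and probabilities are simulated and purely hypothetical.

library(ROCR)

# Hypothetical data: binary class labels and predicted case probabilities
set.seed(1)
labels <- factor(sample(c("Control", "Case"), 60, replace = TRUE))
probs  <- ifelse(labels == "Case",
                 rnorm(60, mean = 0.65, sd = 0.15),
                 rnorm(60, mean = 0.40, sd = 0.15))

# Build the ROC curve; "Case" is treated as the positive class
rocr_pred <- ROCR::prediction(probs, labels, label.ordering = c("Control", "Case"))
roc <- ROCR::performance(rocr_pred, measure = "tpr", x.measure = "fpr")

fpr     <- roc@x.values[[1]]
tpr     <- roc@y.values[[1]]
cutoffs <- roc@alpha.values[[1]]

# Choose the cutoff whose ROC point minimizes the Euclidean distance from (0, 1)
best <- which.min(sqrt(fpr^2 + (1 - tpr)^2))
cutoffs[best]

# Area under the ROC curve
ROCR::performance(rocr_pred, measure = "auc")@y.values[[1]]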
For regression, accuracy is defined as the R-squared of the fitted regression. This value ranges from 0 to 1 so that it can be used with pl and pipe. Note that the aucSkip and plotSkip arguments are ignored for regression.
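A minimal sketch of this metric, assuming the R-squared comes from regressing the actual outcomes on the predicted values (the exact formulation used internally is not shown on this page; the data below are hypothetical):

# Hypothetical data: actual continuous outcomes and the model's predictions
actual    <- c(2.1, 3.4, 5.6, 7.8, 9.0, 4.2)
predicted <- c(2.0, 3.9, 5.1, 8.2, 8.7, 4.6)

# R-squared of the assumed regression of actual on predicted values,
# bounded by 0 and 1 and therefore usable as an "accuracy"
fit <- lm(actual ~ predicted)
summary(fit)$r.squared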
Value

Returns a data.frame of performance metrics.
Methods (by class)

ExprsPredict: Method to calculate performance for classification models.

RegrsPredict: Method to calculate performance for continuous outcome models.