confusion_matrix: Confusion matrix and related ML performance metrics

Description Usage Arguments Value Examples

Description

This function produces a confusion matrix: a table that displays the false positive (FP), false negative (FN), true positive (TP), and true negative (TN) counts obtained by comparing a set of predictions to the true values. The predictions can be either binary or continuous. For continuous predictions, a threshold for translating them into binary classifications must be supplied. If the predictions are already binary, pass in 0.5.
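As a rough sketch of the thresholding and cross-tabulation described above (base R only; this is illustrative, not the package's internal implementation, and the variable names are made up):

```r
# Illustrative sketch: binarize continuous predictions at a threshold,
# then cross-tabulate predicted classes against the true 0/1 outcomes.
predictions <- c(0.10, 0.80, 0.65, 0.30, 0.95)
outcomes    <- c(0,    1,    0,    0,    1)

pred_class <- as.integer(predictions >= 0.5)  # threshold = 0.5

# Rows = predicted class, columns = actual class
cm <- table(Predicted = pred_class, Actual = outcomes)
print(cm)

# Accuracy: proportion of predictions that match the outcomes
accuracy <- sum(diag(cm)) / sum(cm)
```

With these toy values, three predictions exceed the threshold, one of them incorrectly, giving an accuracy of 0.8.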

Usage

confusion_matrix(predictions, outcomes, threshold)

Arguments

predictions

vector of numerics, predicted values

outcomes

vector of numerics, actual values/outcomes; if binary, these must be coded as 0/1

threshold

numeric, value between 0 and 1 to translate continuous predictions to binary classifications

Value

list; a list object that includes the confusion matrix table, accuracy, the kappa statistic, and other summary metrics

Examples

confusion_matrix(predictions = FakePredictionResults$est.risk.score,
                 outcomes = FakePredictionResults$true.risk.bin,
                 threshold = 0.5)
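The Value section mentions a kappa statistic among the returned metrics. A minimal sketch of how Cohen's kappa is conventionally computed from a 2x2 confusion matrix (assumed standard formula, not the package's internal code; the example counts are invented):

```r
# Sketch: Cohen's kappa from a 2x2 confusion matrix of counts
cm <- matrix(c(40, 10,
                5, 45), nrow = 2, byrow = TRUE)  # hypothetical counts

n  <- sum(cm)
po <- sum(diag(cm)) / n                     # observed agreement
pe <- sum(rowSums(cm) * colSums(cm)) / n^2  # agreement expected by chance
kappa <- (po - pe) / (1 - pe)               # chance-corrected agreement
```

Kappa discounts agreement that would occur by chance alone, so it is a stricter summary than raw accuracy for imbalanced outcomes.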

ksboxer/CDIPATools documentation built on June 5, 2019, 8:29 a.m.