auc_pr: Calculates AUC Precision Recall for continuous predictions,...


Description

This function takes the predictions of a model (either binary 0/1 or continuous numeric values in [0,1]), computes the precision-recall curve, and returns the area under that curve given the predicted and actual values. Notes: the baseline is ~0.35, and a perfect score is 1.
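
The computation described above can be sketched in base R as follows. This is an illustrative sketch, not the package's internal implementation; the function name `auc_pr_sketch` and the step-function integration over recall are assumptions.

```r
# Illustrative sketch of a precision-recall AUC (not the package's code).
auc_pr_sketch <- function(predictions, outcomes) {
  # Rank observations from highest to lowest predicted value
  ord <- order(predictions, decreasing = TRUE)
  y <- outcomes[ord]
  # True positives accumulated as the threshold is lowered
  tp <- cumsum(y)
  precision <- tp / seq_along(y)   # TP / (TP + FP)
  recall <- tp / sum(y)            # TP / (TP + FN)
  # Step-function integration of precision over the recall axis
  sum(diff(c(0, recall)) * precision)
}

auc_pr_sketch(predictions = c(0.9, 0.8, 0.4, 0.3),
              outcomes    = c(1, 1, 0, 1))
```

Packages such as PRROC compute the same quantity with more careful interpolation; this sketch only shows the underlying idea.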

Usage

auc_pr(predictions, outcomes)

Arguments

predictions

numeric vector, predicted values

outcomes

numeric vector, actual values/outcomes

Value

numeric, the area under the precision-recall curve

Examples

auc_pr(predictions = FakePredictionResults$est.risk.score,
       outcomes = FakePredictionResults$true.risk.bin)

ksboxer/CDIPATools documentation built on June 5, 2019, 8:29 a.m.