prauc (R Documentation)
Description

Measure to compare true observed labels with predicted probabilities in binary classification tasks.
Usage

prauc(truth, prob, positive, na_value = NaN, ...)
Arguments

truth
(factor())
True (observed) labels. Must have exactly two levels and the same length as prob.

prob
(numeric())
Predicted probability for the positive class. Must have the same length as truth.

positive
(character(1))
Name of the positive class.

na_value
(numeric(1))
Value that should be returned if the measure is not defined for the input (as described in the note). Default is NaN.

...
(any)
Additional arguments. Currently ignored.
Details

Computes the area under the Precision-Recall curve (PRC). The PRC describes the relationship between precision and recall (sensitivity) as the classification threshold varies, and is considered a more appropriate measure than the ROC curve for unbalanced datasets. The PRC is computed by integration of the piecewise function.

This measure is undefined if the true values are either all positive or all negative.
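The integration can be illustrated with a short step-function approximation in plain R. This is only a sketch: prauc_sketch is a hypothetical helper written for this page, and its step-wise summation may differ slightly from the interpolated integration used by the package's implementation.

# Hypothetical helper, for illustration only: approximates the area under
# the precision-recall curve by sorting observations by decreasing
# predicted probability and summing precision over the steps in recall.
prauc_sketch = function(truth, prob, positive) {
  is_pos = truth == positive
  if (all(is_pos) || !any(is_pos)) {
    return(NaN)  # measure undefined: truth is all positive or all negative
  }
  ord = order(prob, decreasing = TRUE)
  is_pos = is_pos[ord]
  tp = cumsum(is_pos)                 # true positives at each cutoff
  precision = tp / seq_along(is_pos)  # fraction of predictions that are correct
  recall = tp / sum(is_pos)           # fraction of positives retrieved
  sum(precision * diff(c(0, recall))) # step-function integral over recall
}

# Example call mirroring the Examples section below
prauc_sketch(factor(c("a", "a", "a", "b")), c(.6, .7, .1, .4), "a")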
Value

Performance value as numeric(1).
Meta Information

Type: "binary"
Range: [0, 1]
Minimize: FALSE
Required prediction: prob
References

Davis J, Goadrich M (2006). "The relationship between precision-recall and ROC curves." In Proceedings of the 23rd International Conference on Machine Learning. ISBN 9781595933836.
See Also

Other Binary Classification Measures: auc(), bbrier(), dor(), fbeta(), fdr(), fnr(), fn(), fomr(), fpr(), fp(), mcc(), npv(), ppv(), tnr(), tn(), tpr(), tp()
Examples

truth = factor(c("a", "a", "a", "b"))
prob = c(.6, .7, .1, .4)
prauc(truth, prob, "a")
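As a further illustration of the na_value argument, the following sketch shows the documented behavior for a degenerate input: per the note above, the measure is undefined when truth contains only one class, and the supplied na_value is returned instead.

# Degenerate truth vector: all observations are positive, so the measure
# is undefined and the supplied na_value is returned
truth_all_pos = factor(c("a", "a", "a"), levels = c("a", "b"))
prauc(truth_all_pos, c(.9, .5, .2), "a", na_value = NA_real_)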