precision | R Documentation
Description:

Given the observed and predicted values of categorical data (with any number of classes), computes the precision: the ratio of true positives to total predicted positives.
Usage:

precision(observed, predicted, positive_class = NULL, remove_na = TRUE)
Arguments:

observed: (factor) Vector of observed (true) class labels.

predicted: (factor) Vector of predicted class labels; must have the same length as observed.

positive_class: (character(1)) The class to be treated as the positive class for binary data. NULL by default.

remove_na: (logical(1)) Should NA values be removed before computing the metric? TRUE by default.
Details:

Given the following binary confusion matrix:

                        Predicted positive   Predicted negative
    Observed positive           TP                   FN
    Observed negative           FP                   TN

precision is computed as:

    Precision = TP / (TP + FP)
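To make the formula Precision = TP / (TP + FP) concrete, here is a minimal base R sketch (an illustration under the definitions above, not the package's implementation) that computes binary precision from observed and predicted factors:

```r
# Minimal base R sketch (illustrative, not the package implementation).
# Binary precision: TP / (TP + FP) for a chosen positive class.
observed  <- factor(c("a", "a", "b", "b", "a"), levels = c("a", "b"))
predicted <- factor(c("a", "b", "b", "a", "a"), levels = c("a", "b"))

positive <- "a"  # class treated as positive
tp <- sum(predicted == positive & observed == positive)  # true positives
fp <- sum(predicted == positive & observed != positive)  # false positives

precision_value <- tp / (tp + fp)
precision_value  # 2 of the 3 positive predictions are correct: 2/3
```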
Value:

For binary data, a single value is returned. For more than two categories, a vector of precisions is returned, one per category.
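The per-category vector described above can be sketched in base R from a confusion matrix (an illustrative computation, not the package's implementation): each class's precision is its diagonal count divided by its column total.

```r
# Illustrative base R sketch of per-class precision for multiclass data.
observed  <- factor(c("a", "b", "c", "a", "b", "c"))
predicted <- factor(c("a", "b", "b", "a", "c", "c"))

cm <- table(Observed = observed, Predicted = predicted)
# Diagonal entries are true positives; column sums are the total
# predictions per class, so each ratio is TP / (TP + FP).
per_class_precision <- diag(cm) / colSums(cm)
per_class_precision  # a = 1.0, b = 0.5, c = 0.5
```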
See Also:

Other categorical_metrics: accuracy(), brier_score(), categorical_summary(), confusion_matrix(), f1_score(), kappa_coeff(), math_mode(), matthews_coeff(), pccc(), pcic(), pr_auc(), recall(), roc_auc(), sensitivity(), specificity()
Examples:

## Not run:
# Perfect binary predictions
precision(factor(c("a", "b")), factor(c("a", "b")))
# Completely wrong binary predictions
precision(factor(c("a", "b")), factor(c("b", "a")))
# Only one class predicted
precision(factor(c("a", "b")), factor(c("b", "b")))
# Logical values coerced to factors
precision(factor(c(TRUE, FALSE)), factor(c(FALSE, TRUE)))
# Multiclass data: returns a vector of precisions, one per category
precision(factor(c("a", "b", "a")), factor(c("b", "a", "c")))
## End(Not run)