precision: Precision

View source: R/metrics.R


Precision

Description

Given the observed and predicted values of categorical data (with any number of classes), computes the precision: the ratio of true positives to total predicted positives.

Usage

precision(observed, predicted, positive_class = NULL, remove_na = TRUE)

Arguments

observed

(factor) The observed values. It must have the same length as predicted.

predicted

(factor) The predicted values. It must have the same length as observed.

positive_class

(character(1)) The name of the class (level) to be taken as the positive class. This parameter is only used for binary variables. NULL by default, in which case the second class in the union of the classes (levels) of observed and predicted is used.

remove_na

(logical(1)) Should NA values be removed? TRUE by default.

Details

Given the following binary confusion matrix:

[Figure: Binary confusion matrix]

precision is computed as:

TP / (TP + FP)
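The formula above can be sketched in base R for the binary case. This is an illustrative computation, not the package's implementation; the data and the choice of "a" as positive class are assumptions:

```r
observed  <- factor(c("a", "a", "b", "b"))
predicted <- factor(c("a", "b", "a", "b"))

# Take "a" as the positive class
tp <- sum(predicted == "a" & observed == "a")  # true positives
fp <- sum(predicted == "a" & observed != "a")  # false positives

tp / (tp + fp)  # 1 / (1 + 1) = 0.5
```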

Value

For binary data a single value is returned; for more than two categories, a vector of precisions is returned, one for each category.
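The multiclass behavior can be sketched as a one-vs-rest computation over the union of classes. This is an assumed sketch in base R, not the package's code; in particular, it returns NA when a class is never predicted, and precision() may handle that case differently:

```r
observed  <- factor(c("a", "b", "a", "c"))
predicted <- factor(c("a", "a", "b", "c"))

classes <- union(levels(observed), levels(predicted))

# One-vs-rest precision per class: TP / (TP + FP)
sapply(classes, function(cls) {
  tp <- sum(predicted == cls & observed == cls)
  predicted_positives <- sum(predicted == cls)
  if (predicted_positives == 0) NA_real_ else tp / predicted_positives
})
```

With the data above this yields 0.5 for "a" (one of two "a" predictions is correct), 0 for "b", and 1 for "c".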

See Also

Other categorical_metrics: accuracy(), brier_score(), categorical_summary(), confusion_matrix(), f1_score(), kappa_coeff(), math_mode(), matthews_coeff(), pccc(), pcic(), pr_auc(), recall(), roc_auc(), sensitivity(), specificity()

Examples

## Not run: 
precision(factor(c("a", "b")), factor(c("a", "b")))
precision(factor(c("a", "b")), factor(c("b", "a")))
precision(factor(c("a", "b")), factor(c("b", "b")))
precision(factor(c(TRUE, FALSE)), factor(c(FALSE, TRUE)))
precision(factor(c("a", "b", "a")), factor(c("b", "a", "c")))

## End(Not run)


brandon-mosqueda/SKM documentation built on Feb. 8, 2025, 5:24 p.m.