accuracy: Accuracy

View source: R/metrics.R


Accuracy

Description

Given the observed and predicted values of categorical data (with any number of classes), computes the accuracy, also known as the proportion of correctly classified cases.

Usage

accuracy(observed, predicted, remove_na = TRUE)

Arguments

observed

(factor) The observed values. It must have the same length as predicted.

predicted

(factor) The predicted values. It must have the same length as observed.

remove_na

(logical(1)) Should NA values be removed? TRUE by default.
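
The exact NA handling is not spelled out here; presumably, when remove_na = TRUE, observation pairs in which either value is NA are dropped before the metric is computed. A base-R sketch under that assumption (not the package's internal code):

observed <- factor(c("a", "b", NA, "b"))
predicted <- factor(c("a", "b", "a", NA))
# Assumption: keep only pairs where both values are non-NA
keep <- complete.cases(observed, predicted)
mean(observed[keep] == predicted[keep])
#> [1] 1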

Details

Accuracy can be computed as:

(1 / N) * (sum(diag(confusion_matrix)))

that is, the sum of the diagonal of the confusion matrix (the correctly classified cases) divided by the total number of values in the matrix (N). An equivalent but more efficient method is used.
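
For illustration, the following base-R sketch (not the package's internal implementation) computes the same quantity both from a confusion matrix built with table() and by direct comparison of the two vectors:

observed <- factor(c("a", "b", "a", "b"))
predicted <- factor(c("a", "b", "b", "b"))

# Via the confusion matrix: correct classifications lie on the diagonal
confusion <- table(observed, predicted)
sum(diag(confusion)) / sum(confusion)
#> [1] 0.75

# Equivalent direct comparison
mean(observed == predicted)
#> [1] 0.75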

Value

A single numeric value with the accuracy.

See Also

Other categorical_metrics: brier_score(), categorical_summary(), confusion_matrix(), f1_score(), kappa_coeff(), math_mode(), matthews_coeff(), pccc(), pcic(), pr_auc(), precision(), recall(), roc_auc(), sensitivity(), specificity()

Examples

## Not run: 
accuracy(c("a", "b"), c("a", "b"))
accuracy(c("a", "b"), c("b", "a"))
accuracy(c("a", "b"), c("b", "b"))
accuracy(c("a", "b", "a"), c("b", "a", "c"))

## End(Not run)

