pccc: Proportion of Correctly Classified Cases (PCCC)

View source: R/metrics.R

pccc R Documentation

Proportion of Correctly Classified Cases (PCCC)

Description

Given the observed and predicted values of categorical data (with any number of classes), computes the Proportion of Correctly Classified Cases (also known as accuracy).

Usage

pccc(observed, predicted, remove_na = TRUE)

Arguments

observed

(factor) The observed values. It must have the same length as predicted.

predicted

(factor) The predicted values. It must have the same length as observed.

remove_na

(logical(1)) Should NA values be removed? TRUE by default.
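As a rough sketch, pairwise NA removal can be pictured as dropping every case where either vector is missing before computing the proportion. This is an illustration of the idea only, not the package's actual code:

```r
observed  <- factor(c("a", "b", NA, "a"))
predicted <- factor(c("a", NA, "b", "a"))

# Keep only pairs where both values are present
# (assumed behavior of remove_na = TRUE)
keep <- !is.na(observed) & !is.na(predicted)
mean(observed[keep] == predicted[keep])  # 1: both remaining pairs match
```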

Details

PCCC can be computed as:

(1 / N) * (sum(diag(confusion_matrix)))

that is, the sum of the diagonal entries of the confusion matrix (the correct classifications) divided by the total number of cases (N). An equivalent but more efficient computation is used internally.
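Both forms can be sketched in base R; this is an illustration of the formula above, not the package's implementation:

```r
observed  <- factor(c("a", "b", "b", "a"))
predicted <- factor(c("a", "b", "a", "a"))

# Confusion-matrix form: sum of the diagonal over the total count
cm <- table(observed, predicted)
sum(diag(cm)) / sum(cm)  # 0.75

# Equivalent, more efficient form: proportion of exact matches
mean(observed == predicted)  # 0.75
```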

Value

A single numeric value with the Proportion of Correctly Classified Cases.

See Also

Other categorical_metrics: accuracy(), brier_score(), categorical_summary(), confusion_matrix(), f1_score(), kappa_coeff(), math_mode(), matthews_coeff(), pcic(), pr_auc(), precision(), recall(), roc_auc(), sensitivity(), specificity()

Examples

## Not run: 
pccc(c("a", "b"), c("a", "b"))  # 1: both cases correctly classified
pccc(c("a", "b"), c("b", "a"))  # 0: no case correctly classified
pccc(c("a", "b"), c("b", "b"))  # 0.5: one of two cases correct
pccc(c("a", "b", "a"), c("b", "a", "c"))  # 0: no case correct

## End(Not run)


brandon-mosqueda/SKM documentation built on Feb. 8, 2025, 5:24 p.m.