confusionMatrixFor_Neg1_0_1: Confusion matrix for categories -1, 0, 1 (the output of predictPair)

View source: R/performance.R

confusionMatrixFor_Neg1_0_1    R Documentation

Confusion matrix for categories -1, 0, 1 (the output of predictPair).

Description

Measures accuracy of predicting categories, where in the predictPair paradigm the categories are the relative ranks of a pair of rows. The categories are:
-1 means Row1 < Row2
0 means the rows are equal or the model guesses
1 means Row1 > Row2
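
For intuition, the category for a pair of rows can be thought of as the sign of the difference between Row1 and Row2 on the criterion. The sketch below is illustrative only; rowPairCategory is a hypothetical helper, not part of heuristica, and real category vectors normally come from predictPair or correctGreater.

# Hypothetical helper (not in the package): derive a -1 / 0 / 1
# category from two criterion values via the sign of their difference.
rowPairCategory <- function(row1_value, row2_value) {
  sign(row1_value - row2_value)
}
rowPairCategory(3, 5)  # -1: Row1 < Row2
rowPairCategory(4, 4)  #  0: rows are equal
rowPairCategory(7, 2)  #  1: Row1 > Row2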

Usage

confusionMatrixFor_Neg1_0_1(ref_data, predicted_data)

Arguments

ref_data

A vector of outcome categories from a reference source to be predicted (e.g. the output of correctGreater).

predicted_data

A vector with outcome categories from a prediction source that is trying to match ref_data (e.g. ttbModel predictions).

Value

A 3x3 matrix of counts. Rows are outcomes of the reference data. Columns are outcomes of the predicted data.
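
Because the result is a plain 3x3 count matrix, standard summary measures can be derived from it with base R. For example, overall accuracy is the proportion of counts on the diagonal; the snippet below is a sketch of that derivation, not a function provided by heuristica.

# Sketch: overall accuracy from the returned count matrix.
m <- confusionMatrixFor_Neg1_0_1(c(1,1,1), c(1,-1,-1))
accuracy <- sum(diag(m)) / sum(m)  # correct pairs / all pairs
accuracy  # 1/3 here: only one of the three predictions matched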

References

Wikipedia's entry on confusion matrices: https://en.wikipedia.org/wiki/Confusion_matrix

Examples

# Example 1
# Below, the correct outcome is always 1, so only the last row of the
# confusion matrix has non-zero counts.  But the predictor makes a few
# mistakes, so some non-zero counts are off the diagonal.
confusionMatrixFor_Neg1_0_1(c(1,1,1), c(1,-1,-1))
# outputs:
#    -1 0 1
# -1  0 0 0
# 0   0 0 0
# 1   2 0 1
#
# Example 2
# The prediction always matches the reference outcome, so all non-zero
# counts are on the diagonal.
confusionMatrixFor_Neg1_0_1(c(1,1,0,0,-1,-1), c(1,1,0,0,-1,-1))
# outputs:
#    -1 0 1
# -1  2 0 0
# 0   0 2 0
# 1   0 0 2
#
