# confusionMatrixFor_Neg1_0_1: Confusion matrix for categories -1, 0, 1 (the output of...

In heuristica: Heuristics Including Take the Best and Unit-Weight Linear

## Description

Measures accuracy when predicting categories, where in the predictPair paradigm the categories are the relative ranks of a pair of rows. The categories are:

* -1 means Row1 < Row2
* 0 means the rows are equal, or a guess
* 1 means Row1 > Row2

## Usage

```
confusionMatrixFor_Neg1_0_1(ref_data, predicted_data)
```

## Arguments

`ref_data`: A vector of outcome categories from a reference source, to be predicted (e.g. the output of correctGreater).

`predicted_data`: A vector of outcome categories from a prediction source that is trying to match `ref_data` (e.g. ttbModel predictions).

## Value

A 3x3 matrix of counts. Rows are outcomes of the reference data; columns are outcomes of the predicted data.
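Because rows are reference outcomes and columns are predicted outcomes, overall accuracy is the share of counts on the diagonal. A minimal sketch in base R (the matrix below is hand-built to match the description above, not produced by the package):

```
# Hand-built 3x3 count matrix in the shape this function returns:
# rows = reference outcomes, columns = predicted outcomes.
m <- matrix(c(2, 0, 0,
              0, 2, 0,
              0, 0, 2),
            nrow = 3, byrow = TRUE,
            dimnames = list(c(-1, 0, 1), c(-1, 0, 1)))
# Diagonal entries are correct predictions, so accuracy is their share.
accuracy <- sum(diag(m)) / sum(m)
```

Here every prediction matches the reference, so `accuracy` is 1.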

## References

Wikipedia's entry on confusion matrices: https://en.wikipedia.org/wiki/Confusion_matrix.

## Examples

```
# Example 1
# Below, the correct outcome is always 1, so only the last row of the
# confusion matrix has non-zero counts. But the predictor makes a few
# mistakes, so some non-zero counts are off the diagonal.
confusionMatrixFor_Neg1_0_1(c(1,1,1), c(1,-1,-1))
# outputs:
#    -1 0 1
# -1  0 0 0
# 0   0 0 0
# 1   2 0 1

# Example 2
# The prediction always matches the reference outcome, so all non-zero
# counts are on the diagonal.
confusionMatrixFor_Neg1_0_1(c(1,1,0,0,-1,-1), c(1,1,0,0,-1,-1))
# outputs:
#    -1 0 1
# -1  2 0 0
# 0   0 2 0
# 1   0 0 2
```
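For intuition, the same shape of cross-tabulation can be reproduced with base R's `table()` by fixing the factor levels to -1, 0, 1 so that empty categories still appear. This is an illustrative sketch of the counting idea, not the package's implementation:

```
# Data from Example 1 above.
ref  <- c(1, 1, 1)
pred <- c(1, -1, -1)
lv   <- c(-1, 0, 1)
# Fixing levels forces all three categories into the table,
# even when a category never occurs in the data.
table(factor(ref, levels = lv), factor(pred, levels = lv))
```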

heuristica documentation built on Sept. 8, 2021, 9:08 a.m.