Description

Measures the accuracy of predicted categories. In the predictPair paradigm, the categories are the relative ranks of a pair of rows:

-1 means Row1 < Row2
 0 means the rows are equal, or the model guesses
 1 means Row1 > Row2
Usage

confusionMatrixFor_Neg1_0_1(ref_data, predicted_data)
Arguments

ref_data: A vector of outcome categories from a reference source, i.e. the categories to be predicted (e.g. the output of correctGreater).

predicted_data: A vector of outcome categories from a prediction source that is trying to match ref_data (e.g. ttbModel predictions).
Value

A 3x3 matrix of counts. Rows are outcomes of the reference data; columns are outcomes of the predicted data.
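Such a count matrix can be reproduced in base R with table() on factors that fix the three levels, so zero-count categories still appear. This is a minimal sketch of the idea, not heuristica's actual implementation:

```r
# Sketch: 3x3 confusion matrix over the fixed outcome set {-1, 0, 1}.
# Fixing factor levels keeps rows/columns for categories with zero counts.
confusion_sketch <- function(ref_data, predicted_data) {
  outcome_levels <- c(-1, 0, 1)
  table(ref  = factor(ref_data, levels = outcome_levels),
        pred = factor(predicted_data, levels = outcome_levels))
}

m <- confusion_sketch(c(1, 1, 1), c(1, -1, -1))
# Row "1" holds all counts: two wrong -1 predictions, one correct 1.
```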
References

Wikipedia's entry on confusion matrices: https://en.wikipedia.org/wiki/Confusion_matrix
Examples

# Example 1
# Below, the correct outcome is always 1, so only the last row of the
# confusion matrix has non-zero counts. But the predictor makes a few
# mistakes, so some non-zero counts are off the diagonal.
confusionMatrixFor_Neg1_0_1(c(1,1,1), c(1,-1,-1))
# outputs:
# -1 0 1
# -1 0 0 0
# 0 0 0 0
# 1 2 0 1
#
# Example 2
# The prediction always matches the reference outcome, so all non-zero
# counts are on the diagonal.
confusionMatrixFor_Neg1_0_1(c(1,1,0,0,-1,-1), c(1,1,0,0,-1,-1))
# outputs:
# -1 0 1
# -1 2 0 0
# 0 0 2 0
# 1 0 0 2
#
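A common follow-up is to reduce the matrix to an overall accuracy: correct predictions lie on the diagonal, so accuracy is the diagonal sum over the total count. The sketch below enters Example 1's output by hand rather than calling the package, so it is self-contained; heuristica may offer its own accuracy helpers, which are not shown here:

```r
# Overall accuracy from a 3x3 confusion matrix.
# The matrix copies Example 1's output: reference always 1,
# predictions 1, -1, -1.
m <- matrix(c(0, 0, 0,
              0, 0, 0,
              2, 0, 1),
            nrow = 3, byrow = TRUE,
            dimnames = list(c("-1", "0", "1"), c("-1", "0", "1")))

accuracy <- sum(diag(m)) / sum(m)  # 1 correct out of 3 pairs
```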