skill_confusionMatrix: Confusion Matrix Statistics


View source: R/skill_confusionMatrix.R

Description

Measurements of categorical forecast accuracy have a long history in weather forecasting. The standard approach involves making binary classifications (detected/not-detected) of predicted and observed data and combining them in a binary contingency table known as a confusion matrix.

This function creates a confusion matrix from predicted and observed values and calculates a wide range of commonly used skill statistics derived from it.
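The underlying binary contingency table can be sketched in base R with `table()`. This is an illustrative sketch only, not the package's implementation; the variable names (`TP`, `FP`, `FN`, `TN`) are assumptions chosen for clarity:

```r
# Toy predicted/observed logical vectors
predicted <- c(TRUE, TRUE, FALSE, FALSE, TRUE)
observed  <- c(TRUE, FALSE, FALSE, TRUE, TRUE)

# 2x2 confusion matrix: rows = predicted, columns = observed
cm <- table(predicted, observed)

TP <- cm["TRUE", "TRUE"]    # hits
FP <- cm["TRUE", "FALSE"]   # false alarms (type I error)
FN <- cm["FALSE", "TRUE"]   # misses (type II error)
TN <- cm["FALSE", "FALSE"]  # correct negatives

# Overall accuracy: fraction of cases classified correctly
accuracy <- (TP + TN) / sum(cm)
```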

Usage

skill_confusionMatrix(
  predicted,
  observed,
  FPCost = 1,
  FNCost = 1,
  lightweight = FALSE
)

Arguments

predicted

logical vector of predicted values

observed

logical vector of observed values

FPCost

cost associated with false positives (type I error)

FNCost

cost associated with false negatives (type II error)

lightweight

logical flag; if TRUE, the return list is created without derived metrics

Value

List containing a table of confusion matrix values and a suite of derived metrics.
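A few of the standard metrics that can be derived from the four confusion matrix cells are sketched below. The formulas for sensitivity, specificity and precision are textbook definitions; the cost-weighted error shown is an illustrative assumption of how `FPCost` and `FNCost` might be applied, not a statement of the package's exact formula, and the element names in the actual return list are not shown here:

```r
# Example cell counts for a 2x2 confusion matrix
TP <- 30; FP <- 10; FN <- 5; TN <- 55

sensitivity <- TP / (TP + FN)  # true positive rate (probability of detection)
specificity <- TN / (TN + FP)  # true negative rate
precision   <- TP / (TP + FP)  # positive predictive value

# Illustrative cost-weighted error with FPCost = 1, FNCost = 1 (assumed formula)
FPCost <- 1; FNCost <- 1
costWeightedError <- (FPCost * FP + FNCost * FN) / (TP + FP + FN + TN)
```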

References

Simple Guide to Confusion Matrix Terminology

See Also

skill_ROC

skill_ROCPlot

Examples

# Simulate predictions and observations with a ~30% detection rate
predicted <- sample(c(TRUE, FALSE), 1000, replace = TRUE, prob = c(0.3, 0.7))
observed <- sample(c(TRUE, FALSE), 1000, replace = TRUE, prob = c(0.3, 0.7))
cm <- skill_confusionMatrix(predicted, observed)
print(cm)

PWFSLSmoke documentation built on July 8, 2020, 7:19 p.m.