confusion.matrix.accuracy.simple: Confusion Matrix Measures - Accuracy


View source: R/confusion.matrix.accuracy.simple.R

Description

Calculate accuracy, defined as (TP + TN) / (P + N), for a given confusion matrix, where P = TP + FN is the number of actual positive cases and N = TN + FP is the number of actual negative cases.

Usage

confusion.matrix.accuracy.simple(
  true.positive = 0,
  false.positive = 0,
  true.negative = 1,
  false.negative = 0,
  count.positive = true.positive + false.negative,
  count.negative = true.negative + false.positive
)

Arguments

true.positive

Scalar - Number of true positive cases

false.positive

Scalar - Number of false positive cases; optional if count.negative is supplied

true.negative

Scalar - Number of true negative cases

false.negative

Scalar - Number of false negative cases; optional if count.positive is supplied

count.positive

Scalar - Total number of actual positive cases (true.positive + false.negative); optional if the first four parameters are supplied.

count.negative

Scalar - Total number of actual negative cases (true.negative + false.positive); optional if the first four parameters are supplied.

Value

A scalar containing the computed accuracy value.
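
Examples

A minimal usage sketch (not part of the package's shipped documentation); the inputs are illustrative and the expected results are worked out from the accuracy formula above, assuming the function returns (TP + TN) / (count.positive + count.negative).

# Confusion matrix: TP = 90, FP = 10, TN = 85, FN = 15
confusion.matrix.accuracy.simple(
  true.positive  = 90,
  false.positive = 10,
  true.negative  = 85,
  false.negative = 15
)
# Expected: (90 + 85) / (105 + 95) = 0.875

# Equivalent call supplying the totals directly, so the
# false.positive and false.negative arguments can be omitted
confusion.matrix.accuracy.simple(
  true.positive  = 90,
  true.negative  = 85,
  count.positive = 105,
  count.negative = 95
)
# Expected: 0.875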
