confusion.matrix.accuracy.simple: Confusion Matrix Measures - Accuracy

View source: R/confusion.matrix.accuracy.simple.R


Confusion Matrix Measures - Accuracy

Description

Calculate accuracy, defined as (TP+TN)/(P+N), where P = TP + FN is the total number of positive cases and N = TN + FP is the total number of negative cases, for a given confusion matrix.

Usage

confusion.matrix.accuracy(confusion.matrix)

confusion.matrix.accuracy.simple(
  true.positive = 0,
  false.positive = 0,
  true.negative = 1,
  false.negative = 0,
  count.positive = true.positive + false.negative,
  count.negative = true.negative + false.positive
)

Arguments

confusion.matrix

Matrix - confusion matrix.

true.positive

Scalar - Number of cases identified as true positive.

false.positive

Scalar - Number of cases identified as false positive - optional if count.negative is supplied (count.negative defaults to true.negative + false.positive).

true.negative

Scalar - Number of cases identified as true negative.

false.negative

Scalar - Number of cases identified as false negative - optional if count.positive is supplied (count.positive defaults to true.positive + false.negative).

count.positive

Scalar - Total number of positive cases (true.positive + false.negative) - optional if the first four parameters are supplied.

count.negative

Scalar - Total number of negative cases (true.negative + false.positive) - optional if the first four parameters are supplied.

Value

A scalar containing the computed accuracy value.
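A usage sketch of the scalar variant, using the signature above. The cell counts here are made-up illustration values, and the install location (burrm/lolcat on GitHub) is taken from the footer of this page:

```r
# Assumption: the package is installable from GitHub as burrm/lolcat.
# remotes::install_github("burrm/lolcat")
library(lolcat)

# Hypothetical counts: 40 true positives, 10 false positives,
# 45 true negatives, 5 false negatives (100 cases in total).
confusion.matrix.accuracy.simple(
  true.positive  = 40,
  false.positive = 10,
  true.negative  = 45,
  false.negative = 5
)
# Per the definition above: (40 + 45) / (50 + 50) = 0.85
```

Equivalently, since count.positive and count.negative default to true.positive + false.negative and true.negative + false.positive respectively, you could pass count.positive = 45 and count.negative = 55 with the same true.positive and true.negative and omit the false.* arguments.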


burrm/lolcat documentation built on July 22, 2022, 6:21 a.m.