mtr_cohen_kappa: Cohen’s Kappa

View source: R/kappa.r

Description

mtr_cohen_kappa computes the Kappa statistic, which measures inter-rater agreement for categorical items.

The Kappa statistic takes the form Kappa = (O - E) / (1 - E), where O is the observed accuracy and E is the expected accuracy based on the marginal totals of the confusion matrix. The statistic can take values between -1 and 1; a value of 0 means the agreement between the actual and predicted classes is no better than chance, while a value of 1 indicates perfect concordance between the model predictions and the observed classes.
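
As a worked illustration of this formula, the sketch below recomputes Kappa by hand from a 2x2 confusion matrix. The helper name kappa_by_hand and the use of >= at the cutoff are assumptions made for illustration; they are not taken from the package, whose own implementation lives in R/kappa.r.

kappa_by_hand <- function(actual, predicted, cutoff = 0.5) {
  # Threshold probabilities into class labels (>= at the cutoff is an assumption)
  pred_class <- factor(as.integer(predicted >= cutoff), levels = c(0, 1))
  act_class  <- factor(actual, levels = c(0, 1))
  cm <- table(act_class, pred_class)            # 2x2 confusion matrix
  n  <- sum(cm)
  O  <- sum(diag(cm)) / n                       # observed accuracy
  E  <- sum(rowSums(cm) * colSums(cm)) / n^2    # expected accuracy from the marginal totals
  (O - E) / (1 - E)
}

kappa_by_hand(c(1, 0, 1, 0, 1), c(0.1, 0.9, 0.3, 0.5, 0.2))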

Usage

mtr_cohen_kappa(actual, predicted, cutoff = 0.5)

Arguments

actual

[numeric] Ground truth binary numeric vector containing 1 for the positive class and 0 for the negative class.

predicted

[numeric] A vector of estimated probabilities for the positive class.

cutoff

[numeric] A cutoff value applied to the predicted vector to classify a sample into a given class. Defaults to 0.5. See the Examples section for an illustration of varying the cutoff.

Value

A numeric scalar.

Author(s)

An Chu

Examples

act <- c(1, 0, 1, 0, 1)                 # ground-truth binary labels
pred <- c(0.1, 0.9, 0.3, 0.5, 0.2)      # estimated probabilities
mtr_cohen_kappa(act, pred)


set.seed(2093)
pred <- runif(1000)
act <- round(pred)                      # labels that initially agree with the predictions
pred[sample(1000, 300)] <- runif(300)   # add noise to 300 of the predictions
mtr_cohen_kappa(act, pred)
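
The lines below additionally illustrate varying the cutoff argument; the input values are made up for illustration and no particular output is implied.

act <- c(1, 0, 1, 1, 0, 1)
pred <- c(0.8, 0.4, 0.6, 0.3, 0.1, 0.7)
mtr_cohen_kappa(act, pred)                 # default cutoff of 0.5
mtr_cohen_kappa(act, pred, cutoff = 0.3)   # a lower cutoff classifies more samples as positive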
