accuracy: Confusion matrix and overall accuracy of predicted binary response


View source: R/accuracy.R

Description

Takes in the actual binary response, predicted probabilities and a cutoff value, and returns the confusion matrix and overall accuracy

Usage

accuracy(y, yhat, cutoff)

Arguments

y

actual binary response variable

yhat

predicted probabilities corresponding to the actual binary response

cutoff

threshold value in the range 0 to 1

Details

When we predict a binary response, the first thing we want to check is the accuracy of the model for a particular cutoff value. This function does just that: it provides the confusion matrix (as counts and as percentages) and the overall accuracy. Overall accuracy is calculated as (TP + TN)/(P + N), that is, the number of correct predictions divided by the total number of observations.

The output is a list from which the individual elements can be picked as shown in the example.
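
For reference, the same figures can be computed directly in base R. The sketch below illustrates the calculation and is not the package's internal code; in particular, treating a probability exactly equal to the cutoff as a predicted 1 is an assumption made here.

# Classify using the cutoff, cross-tabulate against the actual response,
# and divide the correct predictions by the total number of observations
y      <- c(1, 0, 1, 1, 0)
yhat   <- c(0.86, 0.23, 0.65, 0.92, 0.37)
cutoff <- 0.7

pred         <- ifelse(yhat >= cutoff, 1, 0)        # assumed: >= cutoff is coded as 1
confusionNum <- table(Actual = y, Predicted = pred) # confusion matrix (counts)
confusionPer <- 100 * prop.table(confusionNum)      # confusion matrix (percentages)
overallAcc   <- sum(diag(confusionNum)) / sum(confusionNum)  # (TP + TN)/(P + N)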

Value

a three-element list: the confusion matrix as a table of counts, the confusion matrix as a table of percentages, and the overall accuracy value

Author(s)

Akash Jain

See Also

ks, auc, iv, splitdata

Examples

# Load the package
library(StatMeasures)

# A 'data.frame' with y and yhat
df <- data.frame(y = c(1, 0, 1, 1, 0),
                 yhat = c(0.86, 0.23, 0.65, 0.92, 0.37))

# Accuracy tables and overall accuracy figures
ltAccuracy <- accuracy(y = df[, 'y'], yhat = df[, 'yhat'], cutoff = 0.7)
accuracyNumber <- ltAccuracy$accuracyNum
accuracyPercentage <- ltAccuracy$accuracyPer
overallAccuracy <- ltAccuracy$overallAcc
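
With these values and a cutoff of 0.7, four of the five observations fall on the correct side of the cutoff, so the overall accuracy should work out to 4/5 = 0.8; the exact layout of the confusion-matrix tables depends on how the package labels the dimensions.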
