optimalCutoff: Compute the optimal probability cutoff score

Description Usage Arguments Details Value Author(s) Examples

Description

Compute the optimal probability cutoff score, based on a user defined objective.

Usage

optimalCutoff(actuals, predictedScores, optimiseFor = "misclasserror",
  returnDiagnostics = FALSE)

Arguments

actuals

The actual binary flags for the response variable. A numeric vector containing values of either 1 or 0, where 1 represents the 'Good' or 'Events' and 0 represents the 'Bad' or 'Non-Events'.

predictedScores

The prediction probability scores for each observation. If your classification model gives 1/0 predictions, convert them to a numeric vector of 1's and 0's.

optimiseFor

The maximisation criterion for which the probability cutoff score needs to be optimised. Can take one of the following values: "Ones", "Zeros", "Both" or "misclasserror" (default). If "Ones" is used, the 'optimalCutoff' is chosen to maximise detection of "One's"; if "Zeros" is used, it is chosen to maximise detection of "Zero's". If "Both" is specified, the probability cutoff that gives the maximum Youden's Index is chosen. If "misclasserror" is specified, the probability cutoff that gives the minimum misclassification error is chosen.

returnDiagnostics

If TRUE, returns additional diagnostics such as 'sensitivityTable', 'misclassificationError', 'TPR', 'FPR' and 'Specificity' for the chosen cutoff.
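As a point of reference for the "Both" option, Youden's Index at a given cutoff is sensitivity + specificity - 1, which simplifies to TPR - FPR. The following is an illustrative sketch (not the package's internal code) computing it by hand on toy data:

```r
# Youden's Index at a single cutoff: J = TPR - FPR
# (equivalently, sensitivity + specificity - 1). Toy data for illustration.
actuals <- c(1, 1, 1, 0, 0, 0)
scores  <- c(0.9, 0.7, 0.4, 0.6, 0.3, 0.2)

cut   <- 0.5
preds <- as.integer(scores >= cut)

TPR    <- sum(preds == 1 & actuals == 1) / sum(actuals == 1)  # 2/3
FPR    <- sum(preds == 1 & actuals == 0) / sum(actuals == 0)  # 1/3
youden <- TPR - FPR                                           # 1/3
```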

Details

Compute the optimal probability cutoff score for a given set of actuals and predicted probability scores, based on a user defined objective, which is specified by optimiseFor = "Ones", "Zeros", "Both" or "misclasserror" (default).
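Conceptually, this amounts to scanning a grid of candidate cutoffs and keeping the one that optimises the chosen criterion. A minimal sketch of that idea for the "misclasserror" objective, on simulated data (this is an illustration, not the package's internal implementation):

```r
# Simulate actuals and probability-like scores where events tend to
# receive higher scores.
set.seed(1)
actuals <- rbinom(100, 1, 0.5)
predictedScores <- ifelse(actuals == 1, rbeta(100, 4, 2), rbeta(100, 2, 4))

# Scan candidate cutoffs; at each one, classify and measure the
# misclassification error.
cutoffs  <- seq(min(predictedScores), max(predictedScores), by = 0.01)
misclass <- sapply(cutoffs, function(cut) {
  preds <- as.integer(predictedScores >= cut)
  mean(preds != actuals)
})

# The cutoff with the smallest misclassification error wins.
bestCutoff <- cutoffs[which.min(misclass)]
```

For the "Both" objective, the same scan would instead maximise Youden's Index at each candidate cutoff.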

Value

The optimal probability score cutoff that maximises the given criterion. If 'returnDiagnostics' is TRUE, the following items are returned in a list: 'optimalCutoff', 'sensitivityTable', 'misclassificationError', 'TPR', 'FPR' and 'Specificity'.

Author(s)

Selva Prabhakaran selva86@gmail.com

Examples

data('ActualsAndScores')
optimalCutoff(actuals=ActualsAndScores$Actuals,
  predictedScores=ActualsAndScores$PredictedScores, optimiseFor="Both", returnDiagnostics=TRUE)

Example output

$optimalCutoff
[1] 0.6431893

$sensitivityTable
      CUTOFF        FPR        TPR YOUDENSINDEX SPECIFICITY MISCLASSERROR
1  0.7131893          0 0.01176471   0.01176471  1.00000000        0.4941
2  0.7031893          0 0.09411765   0.09411765  1.00000000        0.4529
3  0.6931893          0  0.2235294   0.22352941  1.00000000        0.3882
4  0.6831893          0  0.3294118   0.32941176  1.00000000        0.3353
5  0.6731893 0.04705882  0.4588235   0.41176471  0.95294118        0.2941
6  0.6631893 0.07058824  0.5529412   0.48235294  0.92941176        0.2588
7  0.6531893 0.09411765  0.6705882   0.57647059  0.90588235        0.2118
8  0.6431893  0.1294118  0.7764706   0.64705882  0.87058824        0.1765
9  0.6331893  0.2117647  0.7764706   0.56470588  0.78823529        0.2176
10 0.6231893  0.2470588  0.8235294   0.57647059  0.75294118        0.2118
11 0.6131893  0.3529412  0.8470588   0.49411765  0.64705882        0.2529
12 0.6031893  0.4823529  0.8941176   0.41176471  0.51764706        0.2941
13 0.5931893  0.5529412  0.9294118   0.37647059  0.44705882        0.3118
14 0.5831893        0.6  0.9764706   0.37647059  0.40000000        0.3118
15 0.5731893  0.6588235  0.9882353   0.32941176  0.34117647        0.3353
16 0.5631893  0.7058824  0.9882353   0.28235294  0.29411765        0.3588
17 0.5531893  0.7294118          1   0.27058824  0.27058824        0.3647
18 0.5431893  0.7529412          1   0.24705882  0.24705882        0.3765
19 0.5331893  0.7647059          1   0.23529412  0.23529412        0.3824
20 0.5231893  0.8117647          1   0.18823529  0.18823529        0.4059
21 0.5131893  0.8470588          1   0.15294118  0.15294118        0.4235
22 0.5031893  0.8588235          1   0.14117647  0.14117647        0.4294
23 0.4931893  0.8705882          1   0.12941176  0.12941176        0.4353
24 0.4831893  0.8823529          1   0.11764706  0.11764706        0.4412
25 0.4731893  0.9176471          1   0.08235294  0.08235294        0.4588
26 0.4631893  0.9176471          1   0.08235294  0.08235294        0.4588
27 0.4531893  0.9294118          1   0.07058824  0.07058824        0.4647
28 0.4431893  0.9529412          1   0.04705882  0.04705882        0.4765
29 0.4331893  0.9647059          1   0.03529412  0.03529412        0.4824
30 0.4231893  0.9764706          1   0.02352941  0.02352941        0.4882
31 0.4131893  0.9882353          1   0.01176471  0.01176471        0.4941
32 0.4031893  0.9882353          1   0.01176471  0.01176471        0.4941
33 0.3931893  0.9882353          1   0.01176471  0.01176471        0.4941
34 0.3831893  0.9882353          1   0.01176471  0.01176471        0.4941
35 0.3731893  0.9882353          1   0.01176471  0.01176471        0.4941
36 0.3631893  0.9882353          1   0.01176471  0.01176471        0.4941
37 0.3531893  0.9882353          1   0.01176471  0.01176471        0.4941
38 0.3431893  0.9882353          1   0.01176471  0.01176471        0.4941
39 0.3331893  0.9882353          1   0.01176471  0.01176471        0.4941
40 0.3231893  0.9882353          1   0.01176471  0.01176471        0.4941
41 0.3131893  0.9882353          1   0.01176471  0.01176471        0.4941
42 0.3031893  0.9882353          1   0.01176471  0.01176471        0.4941
43 0.2931893  0.9882353          1   0.01176471  0.01176471        0.4941
44 0.2831893  0.9882353          1   0.01176471  0.01176471        0.4941
45 0.2731893  0.9882353          1   0.01176471  0.01176471        0.4941

$misclassificationError
[1] 0.1765

$TPR
[1] 0.7764706

$FPR
[1] 0.1294118

$Specificity
[1] 0.8705882

InformationValue documentation built on May 1, 2019, 9:12 p.m.