auc (R Documentation)

Description

Get the Area under the ROC Curve (AUC) to assess classifier performance.
Usage

auc(
  preds,
  labels,
  method = c("pROC", "ROCR", "auc_pairs"),
  verbose = FALSE,
  trace = 0
)
Arguments

preds     Numeric vector: Probabilities or model scores (e.g. c(.32, .75, .63)).

labels    True labels of outcomes (e.g. c(0, 1, 1)).

method    Character: "pROC", "ROCR", or "auc_pairs": Method to use.

verbose   Logical: If TRUE, print messages to output.

trace     Integer: If > 0, print additional messages to output.
Note

Important: True labels are assumed to be a factor whose first level is the "positive" case, a.k.a. the event. All three methods ("pROC", "auc_pairs", "ROCR") have been set up to expect this. This goes against the default behavior of "pROC" and "ROCR", which reorder levels and therefore never return an AUC below .5. We avoid that here because a classifier can genuinely perform worse than .5, and automatic level reordering can be very confusing when different functions return different AUC values.
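The "auc_pairs" name suggests the pairwise (concordance) definition of AUC: the probability that a randomly chosen positive case receives a higher score than a randomly chosen negative case, with ties counted as half. Below is a minimal base-R sketch of that definition, assuming binary labels whose first level is the positive case; pairwise_auc is a hypothetical helper for illustration, not part of the package.

## Sketch of the pairwise-concordance definition of AUC (assumption:
## this is what "auc_pairs" computes; not taken from the package source).
pairwise_auc <- function(preds, labels) {
  # First factor level is the positive case, matching the convention above
  pos <- preds[labels == levels(labels)[1]]
  neg <- preds[labels == levels(labels)[2]]
  d <- outer(pos, neg, "-")        # all positive-minus-negative score differences
  mean((d > 0) + 0.5 * (d == 0))   # wins, plus half-credit for ties
}

preds <- c(0.7, 0.55, 0.45, 0.25, 0.6, 0.7, 0.2)
labels <- factor(c("a", "a", "a", "b", "b", "b", "b"))
pairwise_auc(preds, labels)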
Author(s)

EDG
Examples

## Not run:
preds <- c(0.7, 0.55, 0.45, 0.25, 0.6, 0.7, 0.2)
labels <- factor(c("a", "a", "a", "b", "b", "b", "b"))
auc(preds, labels, method = "ROCR")
auc(preds, labels, method = "pROC")
auc(preds, labels, method = "auc_pairs")
## End(Not run)
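As a further illustration of the note above, reversing the scores should yield an AUC below .5 rather than being flipped back above it, since levels are not reordered. A hedged sketch using the example data; the exact value is an expectation under the convention described above, not a documented result.

## Not run:
## Reversed scores: expected to give AUC below .5 because the first
## factor level ("a") remains the positive case and levels are not reordered.
auc(1 - preds, labels, method = "pROC")
## End(Not run)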