auc: Area under the ROC Curve

View source: R/auc.R


Area under the ROC Curve

Description

Get the Area under the ROC curve to assess classifier performance.

Usage

auc(
  preds,
  labels,
  method = c("pROC", "ROCR", "auc_pairs"),
  verbose = FALSE,
  trace = 0
)

Arguments

preds

Numeric vector: Probabilities or model scores (e.g. c(.32, .75, .63))

labels

True labels of outcomes (e.g. c(0, 1, 1)). See Details for how the positive class is determined.

method

Character: Method to use; one of "pROC", "ROCR", or "auc_pairs", which use pROC::roc, ROCR::performance, and auc_pairs, respectively. A sketch of how the pROC and ROCR backends are typically called follows the argument list.

verbose

Logical: If TRUE, print messages to output

trace

Integer: If > 0, print more messages to output
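
For illustration, the sketch below shows how the "pROC" and "ROCR" backends are typically called to obtain an AUC while keeping the first factor level as the positive class (see Details); the exact arguments that auc() passes internally may differ.

# Sketch only: typical backend calls, not the internal code of auc()
preds  <- c(0.7, 0.55, 0.45, 0.25, 0.6, 0.7, 0.2)
labels <- factor(c("a", "a", "a", "b", "b", "b", "b")) # first level "a" = positive

# pROC: fix `levels` (control, case) and `direction` so classes are not reordered
roc_obj <- pROC::roc(
  response = labels,
  predictor = preds,
  levels = rev(levels(labels)), # c(negative, positive)
  direction = "<"
)
pROC::auc(roc_obj)

# ROCR: `label.ordering` is c(negative, positive)
pred <- ROCR::prediction(preds, labels, label.ordering = rev(levels(labels)))
unlist(ROCR::performance(pred, "auc")@y.values)

Passing levels / label.ordering explicitly prevents either package from reordering the classes, which is the behavior described in Details.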

Details

Important Note: We assume that the true labels are a factor whose first level is the "positive" case, a.k.a. the event. All methods used here ("pROC", "auc_pairs", "ROCR") have been set up to expect this. This goes against the default behavior of both "pROC" and "ROCR", which reorder levels and therefore never return an AUC below .5. We avoid that because a classifier can genuinely perform worse than .5, and it is confusing when levels are reordered automatically and different functions report different AUC values.
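
As an illustration of this convention, here is a minimal sketch of a pairwise-concordance AUC (the definition the "auc_pairs" method is named after): the proportion of positive/negative pairs in which the positive case receives the higher score, with ties counted as 1/2. The helper auc_by_pairs below is hypothetical, not the rtemis implementation; note that nothing in this definition forces the result to stay above .5.

# Hypothetical helper, for illustration only (not the rtemis auc_pairs code).
# Assumes the first factor level of `labels` is the positive class.
auc_by_pairs <- function(preds, labels) {
  labels <- factor(labels)
  pos <- preds[labels == levels(labels)[1]]
  neg <- preds[labels != levels(labels)[1]]
  # Concordant pairs count 1, ties count 0.5, discordant pairs count 0
  cmp <- outer(pos, neg, ">") + 0.5 * outer(pos, neg, "==")
  mean(cmp)
}

preds  <- c(0.7, 0.55, 0.45, 0.25, 0.6, 0.7, 0.2)
labels <- factor(c("a", "a", "a", "b", "b", "b", "b"))
auc_by_pairs(preds, labels) # 0.625 under this definition; a worse-than-chance
                            # classifier would yield a value below 0.5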

Author(s)

EDG

Examples

## Not run: 
preds <- c(0.7, 0.55, 0.45, 0.25, 0.6, 0.7, 0.2)
labels <- factor(c("a", "a", "a", "b", "b", "b", "b"))
auc(preds, labels, method = "ROCR")
auc(preds, labels, method = "pROC")
auc(preds, labels, method = "auc_pairs")

## End(Not run)
