binaryROC: ROC curve for binary label prediction

View source: R/assessment.R


ROC curve for binary label prediction

Description

ROC curve for binary label prediction

Usage

binaryROC(
  scores,
  labels,
  cutoff = NULL,
  cut_direction = ">=",
  add_cut1 = TRUE,
  cutoff_point = 0.9
)

Arguments

scores

Prediction score for each sample

labels

True labels for each sample, e.g., from simulation

cutoff

A vector of cutoffs; if NULL, all unique scores are used as cutoffs

cut_direction

A string giving the direction of the comparison against the cutoff: one of ">=", ">", "<=", "<" (see the sketch below the argument list)

add_cut1

Logical value; if TRUE, a cutoff of 1 is added manually

cutoff_point

Numeric value; an additional cutoff value to include
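
How cutoff and cut_direction interact can be pictured with a short sketch. The helper call_at_cutoff below is purely illustrative and not part of cardelino; the package's internal implementation may differ.

call_at_cutoff <- function(scores, cutoff, cut_direction = ">=") {
  # map the direction string to the corresponding comparison operator
  cmp <- switch(cut_direction,
                ">=" = `>=`, ">" = `>`, "<=" = `<=`, "<" = `<`,
                stop("cut_direction must be one of '>=', '>', '<=', '<'"))
  # 1 = predicted positive, 0 = predicted negative
  as.integer(cmp(scores, cutoff))
}

call_at_cutoff(1:10, cutoff = 5)                        # 0 0 0 0 1 1 1 1 1 1
call_at_cutoff(1:10, cutoff = 5, cut_direction = ">")   # 0 0 0 0 0 1 1 1 1 1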

Value

A data.frame containing AUC and AUPRC at various cutoffs.

Examples

scores <- 1:10
labels <- c(0, 0, 0, 1, 0, 1, 0, 1, 1, 1)
binaryROC(scores, labels)

# Extra arguments. 
binaryROC(scores, labels, cutoff = seq(1, 10, by = 2))
binaryROC(scores, labels, cut_direction = ">")
binaryROC(scores, labels, add_cut1 = TRUE)
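
# As a rough cross-check on the AUC for the example above, the TPR/FPR sweep
# can be reproduced by hand. manual_roc_auc is an illustrative sketch, not a
# cardelino function; it assumes a ">=" comparison over all unique scores and
# approximates AUC with the trapezoidal rule, so it need not match binaryROC's
# output columns exactly.
manual_roc_auc <- function(scores, labels) {
  cuts <- sort(unique(scores), decreasing = TRUE)
  # true-positive and false-positive rates at each cutoff (">=" direction)
  tpr <- sapply(cuts, function(ct) mean(scores[labels == 1] >= ct))
  fpr <- sapply(cuts, function(ct) mean(scores[labels == 0] >= ct))
  # prepend the (0, 0) corner and integrate with the trapezoidal rule
  fpr <- c(0, fpr)
  tpr <- c(0, tpr)
  sum(diff(fpr) * (head(tpr, -1) + tail(tpr, -1)) / 2)
}

manual_roc_auc(scores, labels)  # 0.88 for the scores/labels defined above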

