binaryPRC: Precision-recall curve for binary label prediction

View source: R/assessment.R

binaryPRC {cardelino}    R Documentation

Precision-recall curve for binary label prediction

Description

Compute a precision-recall curve for binary label prediction from prediction scores and true binary labels. At each cutoff, samples whose scores pass the cutoff are predicted positive; precision is TP / (TP + FP) and recall is TP / (TP + FN).

Usage

binaryPRC(
  scores,
  labels,
  cutoff = NULL,
  cut_direction = ">=",
  add_cut1 = FALSE,
  empty_precision = 1
)

Arguments

scores

Prediction score for each sample

labels

True binary labels (0/1) for each sample, e.g., from simulation

cutoff

A vector of cutoffs; if NULL, all unique scores are used as cutoffs

cut_direction

A string giving the comparison applied between scores and each cutoff; one of ">=", ">", "<=", "<"

add_cut1

Logical value; if TRUE, a cutoff of 1 is added manually

empty_precision

Numeric value used as the default precision when no samples pass a cutoff and precision would otherwise be undefined; see the sketch below
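
For intuition, the per-cutoff computation can be sketched as follows. This is a minimal illustration of the documented argument semantics, not cardelino's actual implementation; prc_point is a hypothetical helper name.

prc_point <- function(scores, labels, cutoff, cut_direction = ">=",
                      empty_precision = 1) {
  cmp <- match.fun(cut_direction)   # turn ">=" etc. into a comparison function
  pred <- cmp(scores, cutoff)       # predicted positives at this cutoff
  tp <- sum(pred & labels == 1)     # true positives
  fp <- sum(pred & labels == 0)     # false positives
  fn <- sum(!pred & labels == 1)    # false negatives
  precision <- if (tp + fp == 0) empty_precision else tp / (tp + fp)
  recall <- tp / (tp + fn)
  c(cutoff = cutoff, precision = precision, recall = recall)
}

# At cutoff 5: tp = 4, fp = 2, fn = 1, so precision = 0.667, recall = 0.8
prc_point(1:10, c(0, 0, 0, 1, 0, 1, 0, 1, 1, 1), cutoff = 5)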

Value

A data.frame containing recall and precision values at various cutoffs.

Examples

scores <- 1:10
labels <- c(0, 0, 0, 1, 0, 1, 0, 1, 1, 1)
binaryPRC(scores, labels)

# Extra arguments. 
binaryPRC(scores, labels, cutoff = seq(1, 10, by = 2))
binaryPRC(scores, labels, cut_direction = ">")
binaryPRC(scores, labels, add_cut1 = TRUE)
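
# Plotting the returned curve. The column names Recall and Precision are
# assumed here for illustration; check names(binaryPRC(scores, labels))
# on your installation before relying on them.
prc <- binaryPRC(scores, labels)
plot(prc$Recall, prc$Precision, type = "b",
     xlab = "Recall", ylab = "Precision", main = "Precision-recall curve")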

