get.auprc: Area under the precision-recall curve (AUPRC)

View source: R/ranger_RF_util.R

get.auprc                R Documentation

get.auprc

Description

Calculates the area under the precision-recall curve (AUPRC) for a given predictor and a binary outcome.

Usage

get.auprc(predictor, y, positive_class)

Arguments

predictor

A numeric predictor, for example one column of the predicted class-probability table, giving the probability that each observation belongs to the positive class of the factor y.

y

A binary factor vector of observed class labels.

positive_class

The level of the factor y to treat as the positive class.

Details

AUPRC is a bit trickier to interpret than AUROC. The AUPRC of a random classifier is equal to the fraction of positives (Saito et al.), where the fraction of positives is calculated as (# positive examples / total # examples). That means that _different_ classes have _different_ AUPRC baselines. A class with 12% positive examples has a baseline AUPRC of 0.12, so obtaining an AUPRC of 0.40 on this class is great. However, a class with 98% positive examples has a baseline AUPRC of 0.98, so obtaining an AUPRC of 0.40 on this class is bad.
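
As a quick illustration of this baseline (a minimal sketch, not part of the package; the class labels and counts below are made up for the example), the baseline AUPRC of a class is simply the observed fraction of that class in y:

y_toy <- factor(c(rep("pos", 12), rep("neg", 88)))
mean(y_toy == "pos")   # baseline AUPRC for "pos": 12/100 = 0.12
mean(y_toy == "neg")   # baseline AUPRC for "neg": 88/100 = 0.88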

Value

A numeric value: the area under the precision-recall curve (AUPRC) of the predictor with respect to the specified positive class.

Author(s)

Shi Huang

Examples

## Binary outcome: 10 "A" and 50 "B" observations
y <- factor(c(rep("A", 10), rep("B", 50)))
## A three-class outcome and a degenerate single-class outcome
y1 <- factor(c(rep("A", 18), rep("B", 20), rep("C", 22)))
y0 <- factor(rep("A", 60))
## Simulated predicted probabilities of class "A"
pred <- c(runif(10, 0.4, 0.9), runif(50, 0, 0.6))
prob <- data.frame(A = pred, B = 1 - pred)
positive_class <- "A"
get.auprc(predictor = prob[, positive_class], y, positive_class = "A")
get.auprc(predictor = prob[, positive_class], y, positive_class = "B")
get.auprc(predictor = prob[, positive_class], y0, positive_class = "A")
get.auprc(predictor = prob[, positive_class], y1, positive_class = "A")
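
In the binary example above, class "A" makes up 10 of the 60 observations, so its baseline AUPRC is 10/60, roughly 0.17; per the Details section, the AUPRC returned for positive_class = "A" should be compared against that baseline.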
