performanceMeasure: Performance metrics to evaluate classification


View source: R/performance_metrics.R

Description

Quantify the performance of a classification algorithm. Both predictions and observations must be binary.

Usage

performanceMeasure(pred, obs, perf.method = "f.measure", ...)

Arguments

pred

a logical or numeric vector, where 0 and FALSE represent controls, and 1 and TRUE represent cases

obs

a logical or numeric vector, where 0 and FALSE represent controls, and 1 and TRUE represent cases

perf.method

a character, specifying the method to use. Available methods can be listed using perfMethods.

...

additional parameters passed to methods; see Details

Details

The F-measure requires a beta parameter, which can be specified using f.beta and defaults to 1, thereby computing the F1-measure.
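To illustrate the role of f.beta, the following is a minimal sketch of the standard F-beta score computed directly from binary vectors; it shows the formula the measure is based on, and is not necessarily the package's exact internal implementation (edge-case handling may differ):

```r
# Standard F-beta score for binary pred/obs vectors (illustrative sketch).
# beta > 1 weights recall more heavily; beta < 1 weights precision.
fbeta <- function(pred, obs, beta = 1) {
  pred <- as.logical(pred)
  obs <- as.logical(obs)
  tp <- sum(pred & obs)               # true positives
  precision <- tp / sum(pred)         # tp / predicted positives
  recall <- tp / sum(obs)             # tp / actual positives
  (1 + beta^2) * precision * recall / (beta^2 * precision + recall)
}

# precision = 1/3, recall = 1: F1 = 0.5, F2 = 5/7 (recall-weighted)
fbeta(c(1, 1, 1, 0), c(1, 0, 0, 0))
fbeta(c(1, 1, 1, 0), c(1, 0, 0, 0), beta = 2)
```

With precision and recall fixed, increasing beta moves the score toward recall, which is why the F2 score above exceeds the F1 score.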

Value

a numeric, representing the performance

See Also

perfMethods

Examples

pred <- sample(0:1, 100, replace = TRUE, prob = c(0.75, 0.25))
obs <- sample(0:1, 100, replace = TRUE, prob = c(0.75, 0.25))

# compute the F1 and F2 scores
f1 <- performanceMeasure(pred, obs)
f2 <- performanceMeasure(pred, obs, f.beta = 2)

dcanr documentation built on Nov. 8, 2020, 5:48 p.m.