tmca_fscore: Classification evaluation scores F1, Cohen's kappa, Krippendorff's alpha


Description

Classification evaluation scores (F1, Cohen's kappa, Krippendorff's alpha) for comparing two label vectors, prediction and truth. Micro average: sum TP, FP, and FN over all category decisions first, then compute F1. Macro average: compute F1 for each individual category first, then average the per-category scores.
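
The difference between the two averaging schemes can be illustrated with base R alone (a sketch on hypothetical toy labels, independent of tmca_fscore):

```r
# Hypothetical multi-class toy data (not from the package)
truth      <- factor(c("A", "A", "B", "B", "C", "C"))
prediction <- factor(c("A", "B", "B", "B", "C", "A"))

classes <- levels(truth)

# Per-class true positives, false positives, false negatives
tp <- sapply(classes, function(l) sum(prediction == l & truth == l))
fp <- sapply(classes, function(l) sum(prediction == l & truth != l))
fn <- sapply(classes, function(l) sum(prediction != l & truth == l))

# Micro average: pool the counts over all classes first, then compute F1
micro_p  <- sum(tp) / (sum(tp) + sum(fp))
micro_r  <- sum(tp) / (sum(tp) + sum(fn))
micro_f1 <- 2 * micro_p * micro_r / (micro_p + micro_r)

# Macro average: compute F1 per class first, then average
p  <- tp / (tp + fp)
r  <- tp / (tp + fn)
f1 <- 2 * p * r / (p + r)
macro_f1 <- mean(f1, na.rm = TRUE)
```

Micro averaging weights every decision equally, so frequent classes dominate; macro averaging weights every class equally, so rare classes count as much as common ones.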

Usage

tmca_fscore(prediction, truth, positive_class = NULL, evaluate_irr = TRUE)

Arguments

prediction

vector (factor) of predicted labels

truth

vector (factor) of true labels

positive_class

label (level) of the positive class (if not given, the minority class in the true labels is assumed to be the positive class)

evaluate_irr

compute alpha and kappa agreement statistics (requires the irr package)

Value

Evaluation metrics: Precision, Recall, Specificity, Accuracy, F1-score, Alpha, Kappa

Examples

truth <- factor(c("P", "N", "N", "N"))
prediction <- factor(c("P", "P", "P", "N"))
tmca_fscore(prediction, truth, positive_class = "P")
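
For this example, the binary metrics can be verified by hand with base R (a sketch independent of tmca_fscore):

```r
truth      <- factor(c("P", "N", "N", "N"))
prediction <- factor(c("P", "P", "P", "N"))

tp <- sum(prediction == "P" & truth == "P")  # 1
fp <- sum(prediction == "P" & truth == "N")  # 2
fn <- sum(prediction == "N" & truth == "P")  # 0

precision <- tp / (tp + fp)                          # 1/3
recall    <- tp / (tp + fn)                          # 1
f1 <- 2 * precision * recall / (precision + recall)  # 0.5
```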

tm4ss/tmca.classify documentation built on June 24, 2019, 12:37 p.m.