model.acc: Model accuracy


View source: R/model.acc.R

Description

Returns model accuracy scores from a confusion matrix generated by confusion.matrix

Usage

model.acc(conf.mat)

Arguments

conf.mat

a confusion matrix generated by confusion.matrix

Details

Returns accuracy scores for each mode, all expressed as percentages. There are many names for the different accuracy scores, and some are preferable in certain situations; see https://en.wikipedia.org/wiki/Sensitivity_and_specificity

Scores output (a worked sketch in base R follows the list):

1. PPV, Positive predictive value (aka Precision): the percentage of points the model predicts as a given mode that are also observed as that mode

2. Sensitivity (aka Recall, Detection rate, True positive rate): the ratio of true positives to the sum of true positives and false negatives

3. NPV, Negative predictive value: the ratio of true negatives to the sum of true negatives and false negatives

4. Specificity (aka True negative rate): the ratio of true negatives to the sum of true negatives and false positives

5. Accuracy: the sum of true positives and true negatives divided by the total number of points

6. F1 score: the harmonic mean of PPV and sensitivity
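
To make these definitions concrete, the following is a minimal base-R sketch of the calculations for a single mode, assuming a 2x2 confusion matrix with predicted modes as rows and observed modes as columns. The matrix values and the variable names (cm, tp, fp, fn, tn) are illustrative only and are not part of the package.

cm <- matrix(c(6, 2,
               1, 1),
             nrow = 2, byrow = TRUE,
             dimnames = list(predicted = c("Mode1", "Mode2"),
                             observed = c("Mode1", "Mode2")))

tp <- cm["Mode1", "Mode1"]  # true positives for Mode1
fp <- cm["Mode1", "Mode2"]  # false positives (predicted Mode1, observed Mode2)
fn <- cm["Mode2", "Mode1"]  # false negatives (predicted Mode2, observed Mode1)
tn <- cm["Mode2", "Mode2"]  # true negatives

ppv         <- 100 * tp / (tp + fp)       # positive predictive value
sensitivity <- 100 * tp / (tp + fn)       # recall / true positive rate
npv         <- 100 * tn / (tn + fn)       # negative predictive value
specificity <- 100 * tn / (tn + fp)       # true negative rate
accuracy    <- 100 * (tp + tn) / sum(cm)  # overall accuracy
f1 <- 2 * ppv * sensitivity / (ppv + sensitivity)  # harmonic mean of PPV and sensitivity

With this example matrix, accuracy is 100 * (6 + 1) / 10 = 70 percent.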

Value

A seven-column data.frame with accuracy scores per mode

Examples

# predicted modes for ten points
pred <- numeric(10) + 1
pred[5:6] <- 2
pred <- factor(pred, labels = c("Mode1", "Mode2"))

# observed modes for the same ten points
obs <- numeric(10) + 1
obs[6:8] <- 2
obs <- factor(obs, labels = c("Mode1", "Mode2"))

# accuracy scores per mode
model.acc(confusion.matrix(pred, obs))
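
For a quick sanity check, the raw counts behind these scores can be inspected with a base-R cross-tabulation of the same vectors. This is only an approximate stand-in for confusion.matrix, whose exact output format is defined by the package:

table(predicted = pred, observed = obs)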
