Description

Returns model accuracy from a confusion matrix derived from confusion.matrix.
Usage

model.acc(conf.mat)
Arguments

conf.mat    a confusion matrix generated by confusion.matrix
Details

Returns accuracy scores for each mode, all expressed as percentages. There are many names for the different accuracy scores, and some scores are preferable in certain situations; see https://en.wikipedia.org/wiki/Sensitivity_and_specificity
Scores output (a worked sketch of these calculations follows the list):
1. PPV, Positive predictive value (aka Precision): the percentage of points predicted as each mode that are also observed as that mode
2. Sensitivity (aka Recall, Detection rate, True Positive rate): the ratio of true positives to true positives and false negatives
3. NPV, Negative predictive value: The ratio of true negatives to true negatives and false negatives
4. Specificity (aka True Negative Rate): The ratio of true negatives to true negatives and false positives
5. Accuracy: The sum of true positives and negatives divided by the sum of true and false negatives and positives
6. F1 score: The harmonic mean of PPV and sensitivity
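To make the definitions above concrete, the following is a minimal R sketch that computes the same six scores by hand from a small two-mode confusion matrix (rows assumed to be observed modes, columns predicted modes). The matrix values, mode names, and the helper function are illustrative assumptions only; they are not the package's internal implementation, which model.acc applies directly to a confusion.matrix result.

## Illustrative only: a hand-built two-mode confusion matrix.
## Assumed layout: rows = observed mode, columns = predicted mode.
cm <- matrix(c(40,  5,
               10, 45),
             nrow = 2, byrow = TRUE,
             dimnames = list(observed  = c("walk", "drive"),
                             predicted = c("walk", "drive")))

## Hypothetical helper computing the six scores (as percentages) for one mode.
scores.for.mode <- function(cm, mode) {
  tp <- cm[mode, mode]             # observed and predicted as this mode
  fn <- sum(cm[mode, ]) - tp       # observed as this mode, predicted otherwise
  fp <- sum(cm[, mode]) - tp       # predicted as this mode, observed otherwise
  tn <- sum(cm) - tp - fn - fp     # neither observed nor predicted as this mode
  ppv         <- 100 * tp / (tp + fp)          # precision
  sensitivity <- 100 * tp / (tp + fn)          # recall / true positive rate
  npv         <- 100 * tn / (tn + fn)
  specificity <- 100 * tn / (tn + fp)          # true negative rate
  accuracy    <- 100 * (tp + tn) / sum(cm)
  f1          <- 2 * ppv * sensitivity / (ppv + sensitivity)
  c(PPV = ppv, Sensitivity = sensitivity, NPV = npv,
    Specificity = specificity, Accuracy = accuracy, F1 = f1)
}

## One row of scores per mode, similar in spirit to what model.acc returns.
t(sapply(rownames(cm), function(m) scores.for.mode(cm, m)))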
Value

A seven-column data.frame, with accuracy scores per mode
Examples
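The example code did not survive in this copy of the page, so the snippet below is only a hedged sketch of typical usage. The obs and pred vectors and the confusion.matrix call signature are assumptions for illustration; consult the confusion.matrix help page for its actual arguments.

## Assumed workflow (not the original example): build a confusion matrix from
## observed and predicted modes, then score it with model.acc.
obs  <- c("walk", "walk", "drive", "drive", "walk", "drive")
pred <- c("walk", "drive", "drive", "drive", "walk", "walk")

conf.mat <- confusion.matrix(obs, pred)   # assumed call signature
model.acc(conf.mat)                       # accuracy scores per mode, as percentages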