View source: R/score_roc_measures.R
| score_roc_measures | R Documentation |
Calculate a set of ROC performance measures based on the confusion matrix of a classification prediction.
tpr True positive rate (Sensitivity, Recall): TP / (TP + FN)
fpr False positive rate (Fall-out): FP / (FP + TN)
fnr False negative rate (Miss rate): FN / (FN + TP)
tnr True negative rate (Specificity): TN / (TN + FP)
ppv Positive predictive value (Precision): TP / (TP + FP)
fomr False omission rate: FN / (FN + TN)
lrp Positive likelihood ratio (LR+): tpr / fpr
fdr False discovery rate: FP / (FP + TP)
npv Negative predictive value: TN / (TN + FN)
acc Accuracy: (TP + TN) / (TP + TN + FP + FN)
lrm Negative likelihood ratio (LR-): fnr / tnr
dor Diagnostic odds ratio: LR+ / LR-
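The measures above are the standard confusion-matrix ratios. As a sketch in plain base R (illustrative only, independent of the package's internal implementation; the cell counts are made up):

```r
# Hypothetical 2 x 2 confusion matrix cell counts (absolute frequencies)
tp <- 40; fn <- 10; fp <- 5; tn <- 45

tpr <- tp / (tp + fn)                 # sensitivity / recall
fpr <- fp / (fp + tn)                 # fall-out
fnr <- fn / (fn + tp)                 # miss rate
tnr <- tn / (tn + fp)                 # specificity
ppv <- tp / (tp + fp)                 # precision
npv <- tn / (tn + fn)
fomr <- fn / (fn + tn)                # false omission rate
fdr  <- fp / (fp + tp)                # false discovery rate
acc  <- (tp + tn) / (tp + tn + fp + fn)
lrp  <- tpr / fpr                     # positive likelihood ratio (LR+)
lrm  <- fnr / tnr                     # negative likelihood ratio (LR-)
dor  <- lrp / lrm                     # diagnostic odds ratio

print(c(tpr = tpr, fpr = fpr, acc = acc, dor = dor))
```

Note that the diagnostic odds ratio simplifies to (TP * TN) / (FP * FN), which is a quick way to sanity-check the result.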
score_roc_measures(pred)
pred (PredictionClassif)
The classification prediction to score.

Returns a list() with two elements: confusion_matrix, the 2 x 2 confusion matrix of absolute frequencies, and measures, a list of the above-mentioned measures.
library(mlr3)

# split the task into training and test sets
task = tsk("pima")
splits = partition(task, ratio = 0.7)

# train on the training rows, predict on the held-out test rows
learner = lrn("classif.rpart", predict_type = "prob")
learner$train(task, row_ids = splits$train)
pred = learner$predict(task, row_ids = splits$test)

score_roc_measures(pred)