View source: R/calculateROCMeasures.R
calculateROCMeasures — R Documentation
Calculate the absolute number of correct/incorrect classifications and the following evaluation measures:
tpr True positive rate (Sensitivity, Recall)
fpr False positive rate (Fall-out)
fnr False negative rate (Miss rate)
tnr True negative rate (Specificity)
ppv Positive predictive value (Precision)
for False omission rate
lrp Positive likelihood ratio (LR+)
fdr False discovery rate
npv Negative predictive value
acc Accuracy
lrm Negative likelihood ratio (LR-)
dor Diagnostic odds ratio
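All of the measures above are simple functions of the four confusion-matrix counts. As a minimal base-R sketch (independent of the package, using made-up counts), they can be computed like this:

```r
# Hypothetical confusion-matrix counts (illustration only, not from the package):
tp <- 20; fp <- 10; fn <- 5; tn <- 65

tpr <- tp / (tp + fn)   # true positive rate (sensitivity, recall)
fpr <- fp / (fp + tn)   # false positive rate (fall-out)
fnr <- fn / (fn + tp)   # false negative rate (miss rate)
tnr <- tn / (tn + fp)   # true negative rate (specificity)
ppv <- tp / (tp + fp)   # positive predictive value (precision)
npv <- tn / (tn + fn)   # negative predictive value
fdr <- fp / (fp + tp)   # false discovery rate
fomr <- fn / (fn + tn)  # false omission rate (named fomr, see below)
lrp <- tpr / fpr        # positive likelihood ratio (LR+)
lrm <- fnr / tnr        # negative likelihood ratio (LR-)
dor <- lrp / lrm        # diagnostic odds ratio
acc <- (tp + tn) / (tp + fp + fn + tn)  # accuracy

round(c(tpr = tpr, fpr = fpr, ppv = ppv, acc = acc), 2)
```

With these counts, tpr = 0.8 and acc = 0.85; the diagnostic odds ratio works out to lrp / lrm = 6 / (3/13) = 26.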
For details on the used measures, see measures and also https://en.wikipedia.org/wiki/Receiver_operating_characteristic.
The element for the false omission rate in the resulting object is not called for but
fomr, since for is a reserved word in R and should never be used as a variable name in an object.
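To illustrate the naming constraint (a small base-R aside, not part of the package): for is reserved, so it can only appear as a name when backtick-quoted, which makes access awkward:

```r
# `for` is reserved: the assignment for <- 1 would be a parse error.
# It can only serve as a list element name when backtick-quoted:
res <- list(`for` = 0.25)
res$`for`       # accessible, but only with backticks

# Naming the element fomr avoids the problem entirely:
res <- list(fomr = 0.25)
res$fomr
```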
calculateROCMeasures(pred)
## S3 method for class 'ROCMeasures'
print(x, abbreviations = TRUE, digits = 2, ...)
pred: (Prediction) Prediction object.
x: (ROCMeasures) Object created by calculateROCMeasures.
abbreviations: (logical(1)) Should the measure abbreviations be explained in the printed output? Default is TRUE.
digits: (integer(1)) Number of digits the measures are rounded to when printed. Default is 2.
...: (any) Further arguments.
(ROCMeasures). A list containing two elements: confusion.matrix, the 2 x 2 confusion matrix of absolute frequencies, and measures, a list of the above mentioned measures.
print(ROCMeasures):
Other roc:
asROCRPrediction()
Other performance:
ConfusionMatrix,
calculateConfusionMatrix(),
estimateRelativeOverfitting(),
makeCostMeasure(),
makeCustomResampledMeasure(),
makeMeasure(),
measures,
performance(),
setAggregation(),
setMeasurePars()
library(mlr)  # the example assumes mlr is attached
lrn = makeLearner("classif.rpart", predict.type = "prob")
fit = train(lrn, sonar.task)
pred = predict(fit, task = sonar.task)
r = calculateROCMeasures(pred)
r
print(r, abbreviations = FALSE, digits = 3)