A group of functions to plot precision-recall and ROC curves and to compute f-scores from the matrix returned by the evaluate function.
table: The matrix returned by the evaluate function.

beta: Numeric used as the weight of the recall in the f-score formula - see Details. The default value of this argument is 1, meaning precision is as important as recall.

k: Numeric used as the index up to which the area under the curve is computed - see Details. The default value of this argument is -1, meaning that the whole area under the curve is computed.

device: The device to be used. This parameter allows the user to plot precision-recall and receiver operating characteristic curves for several inference algorithms in the same plotting window - see Examples.

...: Arguments passed to plot.
A confusion matrix contains the TP, FP, TN and FN counts.
"true positive rate" tpr = TP/(TP+FN)
"false positive rate" fpr = FP/(FP+TN)
"precision" p = TP/(FP+TP)
"recall" r = TP/(TP+FN)
"f-beta-score" Fbeta = (1+beta) * p*r / (r + beta*p)
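As an illustration, the rates and f-score defined above can be computed directly in R from the four counts of a confusion matrix (the counts below are made up for this example):

```r
# Illustrative counts - not output of the package
TP <- 40; FP <- 10; TN <- 45; FN <- 5

tpr <- TP / (TP + FN)   # true positive rate (equals recall)
fpr <- FP / (FP + TN)   # false positive rate
p   <- TP / (TP + FP)   # precision
r   <- TP / (TP + FN)   # recall

beta  <- 1
fbeta <- (1 + beta) * p * r / (r + beta * p)
```

With beta = 1 the f-score reduces to the harmonic mean of precision and recall, i.e. precision and recall carry equal weight.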
The function roc.plot (pr.plot) plots the ROC curve (PR curve) and returns the device associated with the plotting window.
The function auroc (aupr) computes the area under the ROC curve (PR curve) using the trapezoidal approximation up to point k.
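A minimal sketch of the trapezoidal approximation under stated assumptions (the helper name trapz is chosen for this example and is not part of the package):

```r
# Trapezoidal area under a curve given by points (x, y), up to index k.
# k = -1, as in the auroc/aupr default, means use all points.
trapz <- function(x, y, k = -1) {
  if (k > 0) { x <- x[seq_len(k)]; y <- y[seq_len(k)] }
  n <- length(x)
  # Sum of trapezoid areas between consecutive points
  sum((x[-1] - x[-n]) * (y[-1] + y[-n]) / 2)
}

# The diagonal ROC curve of a random classifier has area 0.5
trapz(c(0, 0.5, 1), c(0, 0.5, 1))
```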
The function fscore returns the f-scores computed from the confusion matrices contained in the 'table' argument - see Details.
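As a sketch of what such a per-threshold computation looks like (the column names tp, fp, fn and the helper name fscore_rows are assumptions made for this example, not the package's actual layout or API):

```r
# One confusion matrix per row, e.g. one per decision threshold
tbl <- data.frame(tp = c(10, 30, 40),
                  fp = c(0, 5, 20),
                  fn = c(35, 15, 5))

fscore_rows <- function(tbl, beta = 1) {
  p <- tbl$tp / (tbl$tp + tbl$fp)     # precision per row
  r <- tbl$tp / (tbl$tp + tbl$fn)     # recall per row
  (1 + beta) * p * r / (r + beta * p) # f-beta formula from Details
}

fscore_rows(tbl)
```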