comp.metr: Compute metrics


Description

A group of functions to plot precision-recall and ROC curves and to compute f-scores from the matrix returned by the evaluate function.

Usage

    fscore(table, beta = 1)
    auroc(table, k = -1)
    aupr(table, k = -1)
    pr.plot(table, device = -1, ...)
    roc.plot(table, device = -1, ...)

Arguments

table

The matrix returned by the evaluate function, whose columns contain the confusion-matrix values TP, FP, TN and FN - see evaluate.

beta

Numeric used as the weight of the recall in the f-score formula - see details. The default value is 1, meaning that precision is as important as recall.

k

Numeric used as the index up to which the area under the curve is computed - see details. The default value is -1, meaning that the whole area under the curve is computed.

device

The device to be used. This parameter allows the user to plot precision-recall and receiver operating characteristic curves for various inference algorithms on the same plotting window - see examples.

...

Arguments passed to plot.

Details

A confusion matrix contains the TP, FP, TN and FN values.
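As an illustration of how such a table can be turned into f-scores, the following is a minimal sketch (not the package's implementation); it assumes the standard F-beta definition and columns named TP, FP and FN (TN is not needed for precision and recall):

    # Sketch only: precision, recall and F-beta from a TP/FP/TN/FN matrix,
    # assuming the standard F-beta definition where beta weights the recall.
    fscore_sketch <- function(table, beta = 1) {
      tp <- table[, "TP"]; fp <- table[, "FP"]; fn <- table[, "FN"]
      precision <- tp / (tp + fp)
      recall    <- tp / (tp + fn)
      (1 + beta^2) * precision * recall / (beta^2 * precision + recall)
    }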

Value

The function roc.plot (pr.plot) plots the ROC-curve (PR-curve) and returns the device associated with the plotting window.

The function auroc (aupr) computes the area under the ROC-curve (PR-curve) using the trapezoidal approximation until point k.

The function fscore returns the f-scores corresponding to the confusion matrices contained in the table argument - see details.
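To make the trapezoidal approximation concrete, here is a minimal sketch (not the package's implementation) of an area-under-curve computation truncated at index k; the coordinates x and y stand for, e.g., the recall/precision or FPR/TPR points of the curve:

    # Sketch only: trapezoidal area under a curve given by points (x, y),
    # truncated at index k (k = -1 means use the whole curve).
    auc_sketch <- function(x, y, k = -1) {
      if (k > 0) { x <- x[1:k]; y <- y[1:k] }
      sum(diff(x) * (head(y, -1) + tail(y, -1)) / 2)
    }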

See Also

evaluate, plot

Examples

    library(netbenchmark)
    # Inference: a simple correlation-based network
    Net <- cor(syntren300.data)
    # Validation: confusion matrices at successive thresholds
    tbl <- evaluate(Net, syntren300.net)
    # Best f-score over all thresholds
    max(fscore(tbl))
    # Plot the PR-curve and keep the plotting device
    dev <- pr.plot(tbl, col = "green", type = "l")
    # Area under the PR-curve
    aupr(tbl)
    # Threshold index achieving the best f-score
    idx <- which.max(fscore(tbl))
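The device argument is intended for overlaying curves from several inference algorithms in one window. A minimal sketch of that use, assuming pr.plot accepts the device returned by the earlier call (the second, Spearman-based network is purely illustrative):

    # Sketch only: overlay a second PR-curve on the same plotting window.
    Net2 <- cor(syntren300.data, method = "spearman")  # illustrative second inference
    tbl2 <- evaluate(Net2, syntren300.net)
    pr.plot(tbl2, device = dev, col = "blue", type = "l")
    aupr(tbl2)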
