Description Usage Arguments Details Value Examples

This function calculates the Precision, Recall and AUC of multi-class classifications.


`data`: A data frame containing the true labels of multiple groups and the corresponding predictive scores.

`force_diag`: If TRUE, the TPR and FPR curves will be forced to pass through (0, 0) and (1, 1).

A data frame is required as input. It should contain true-label columns (0 = negative, 1 = positive) named XX_true (e.g. S1_true, S2_true and S3_true) and continuous predictive-score columns named XX_pred_YY (e.g. S1_pred_SVM, S2_pred_RF); this naming scheme allows ROC to be calculated for multiple classifiers.

Predictive scores can be probabilities in [0, 1] or any other continuous values. For each classifier, the number of score columns must equal the number of groups in the true labels. The order of the columns does not affect the results.
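A minimal sketch of an input data frame in this layout (the group names S1-S3 and the classifier name SVM are only illustrative):

```r
# Illustrative input: three groups (S1, S2, S3) and one classifier (SVM).
# True-label columns are named XX_true; score columns are named XX_pred_YY.
set.seed(1)
dat <- data.frame(
  S1_true = c(1, 0, 0, 1, 0, 1, 0, 0, 1, 0),
  S2_true = c(0, 1, 0, 0, 1, 0, 1, 0, 0, 1),
  S3_true = c(0, 0, 1, 0, 0, 0, 0, 1, 0, 0),
  S1_pred_SVM = runif(10),  # continuous scores, here in [0, 1]
  S2_pred_SVM = runif(10),
  S3_pred_SVM = runif(10)
)
str(dat)
```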

Recall, Precision, and AUC are calculated for each group and each method, along with the macro- and micro-average AUC across all groups for each method.

The micro-average ROC/AUC is calculated by stacking all groups together, converting the multi-class classification into a binary one. The macro-average ROC/AUC is calculated by averaging the results of all groups (one vs. rest), with linear interpolation between the points of each ROC curve.
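The stacking step behind micro-averaging can be sketched as follows (an illustration of the idea, not the package's internal code):

```r
# Toy data: three samples, three groups, one classifier (names illustrative).
df <- data.frame(
  S1_true = c(1, 0, 0), S2_true = c(0, 1, 0), S3_true = c(0, 0, 1),
  S1_pred_SVM = c(0.8, 0.1, 0.2),
  S2_pred_SVM = c(0.1, 0.7, 0.3),
  S3_pred_SVM = c(0.1, 0.2, 0.5)
)
# Micro-averaging stacks every group's labels and scores into single
# vectors, so one binary ROC curve summarises all groups at once.
micro_true  <- unlist(df[, grepl("_true$",  names(df))], use.names = FALSE)
micro_score <- unlist(df[, grepl("_pred_", names(df))], use.names = FALSE)
length(micro_true)  # 9 stacked observations: 3 samples x 3 groups
```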

AUC will be calculated using the function `cal_auc()`.
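The AUC of a set of ROC points can be obtained with the trapezoidal rule; the helper below is a hypothetical sketch of that idea, not the source of `cal_auc()`:

```r
# Trapezoidal AUC over (FPR, TPR) points, using linear interpolation
# between the points of the ROC curve.
trapezoid_auc <- function(fpr, tpr) {
  ord <- order(fpr)  # ensure points run from low to high FPR
  fpr <- fpr[ord]
  tpr <- tpr[ord]
  sum(diff(fpr) * (head(tpr, -1) + tail(tpr, -1)) / 2)
}
trapezoid_auc(c(0, 0.5, 1), c(0, 1, 1))  # 0.75 for this toy curve
```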

`Recall`: A list of recall values for each group, each method, and the micro-/macro-averages.

`Precision`: A list of precision values for each group, each method, and the micro-/macro-averages.

`AUC`: A list of AUC values for each group, each method, and the micro-/macro-averages.

`Methods`: A vector containing the names of the different classifiers.

`Groups`: A vector containing the names of the different groups.

