View source: R/confidence_score_evaluation.R

confidence_score_evaluation    R Documentation
Description

Evaluate the performance of the confidence scores generated by one or more aggregation methods. Probabilistic confidence scores are assumed for the selected metrics.
Usage

confidence_score_evaluation(confidence_scores, outcomes)
Arguments

confidence_scores
    A dataframe in the format output by the

outcomes
    A dataframe with two columns:
Value

An evaluated dataframe with four columns:

method
    Character variable describing the aggregation method.

AUC
    Area Under the Curve (AUC) of the ROC curve; see ?precrec::auc.

Brier_Score
    Brier score; see ?DescTools::BrierScore.

Classification_Accuracy
    Classification accuracy, measured as pcc (percent correctly classified); see ?MLmetrics::Accuracy.
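To illustrate what the three reported metrics measure, here is a minimal, self-contained sketch in Python (not the package's implementation, which delegates to precrec, DescTools, and MLmetrics in R). The toy scores and outcomes below are hypothetical, and the 0.5 classification threshold is an assumption for illustration.

```python
def auc(scores, labels):
    # Rank-based (Mann-Whitney) AUC: the probability that a randomly chosen
    # positive case receives a higher confidence score than a negative one.
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

def brier_score(scores, labels):
    # Mean squared difference between the probabilistic score and the
    # binary outcome; lower is better.
    return sum((s - y) ** 2 for s, y in zip(scores, labels)) / len(scores)

def classification_accuracy(scores, labels, threshold=0.5):
    # Percent correctly classified (pcc) after thresholding the scores.
    preds = [1 if s >= threshold else 0 for s in scores]
    return sum(p == y for p, y in zip(preds, labels)) / len(labels)

# Hypothetical confidence scores for five claims and their true outcomes.
scores = [0.9, 0.8, 0.3, 0.6, 0.2]
labels = [1, 1, 0, 1, 0]
print(auc(scores, labels))                      # 1.0 (perfect ranking)
print(brier_score(scores, labels))              # 0.068
print(classification_accuracy(scores, labels))  # 1.0
```

In the actual function, these three numbers are computed once per aggregation method, yielding one row of the output dataframe per method.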
Examples

confidence_score_evaluation(data_confidence_scores,
                            data_outcomes)