Description

Returns the best binary threshold rule based on accuracy.
Usage

accuracy_threshold(x, group, pos_class)
Arguments

x          a numeric predictor.
group      a binary grouping variable.
pos_class  the group level for the positive class.
Details

The threshold rule is chosen to minimize the misclassification rate (1 - accuracy). An upper rule classifies an observation as positive when the predictor is greater than the cut; a lower rule classifies it as positive when the predictor is less than or equal to the cut.
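As a rough illustration only, the sketch below shows how such an accuracy-optimal cut could be found; the helper name and the exhaustive search over observed predictor values are assumptions, not the package's actual implementation.

# Minimal sketch (assumption: NOT the package's implementation) of finding the
# accuracy-optimal cut by exhaustive search over the observed predictor values.
accuracy_threshold_sketch <- function(x, group, pos_class) {
  is_pos <- group == pos_class
  cuts <- sort(unique(x))
  # misclassification rate of an upper rule (positive if x > cut) at each candidate cut
  upper_err <- sapply(cuts, function(cut) mean((x > cut) != is_pos))
  # misclassification rate of a lower rule (positive if x <= cut) at each candidate cut
  lower_err <- sapply(cuts, function(cut) mean((x <= cut) != is_pos))
  if (min(upper_err) <= min(lower_err)) {
    list(misclass = min(upper_err), cut = cuts[which.min(upper_err)], direction = ">")
  } else {
    list(misclass = min(lower_err), cut = cuts[which.min(lower_err)], direction = "<=")
  }
}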
Value

misclass   rate of misclassification.
cut        selected threshold.
direction  indicator for the direction of the positive group.
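For instance, the returned components can be accessed by name (a minimal illustration; the simulated data here are only for show):

# Illustration only: access the returned components by name
set.seed(1)
res <- accuracy_threshold(x = rnorm(50), group = rep(0:1, 25), pos_class = 1)
res$misclass   # misclassification rate at the selected cut
res$cut        # selected threshold
res$direction  # direction of the positive-class rule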
See Also

max_fp_threshold, score_threshold
Examples

### General Use: ###
set.seed(123)
x <- c(rnorm(100,0,1),rnorm(100,2,1))
group <- c(rep(0,100),rep(2,100))
accuracy_threshold(x=x, group=group, pos_class=2)
accuracy_threshold(x=x, group=group, pos_class=0)
### Bagged Example ###
set.seed(123)
replicate_function = function(index){accuracy_threshold(x=x[index], group=group[index], pos_class=2)$cut}
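# Draw bootstrap samples and record the accuracy-optimal cut for each resample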
sample_cuts = replicate(100, {
sample_index = sample.int(n=length(x),replace=TRUE)
replicate_function(index=sample_index)
})
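# Bagged score for each observation: the proportion of bootstrap cuts it exceeds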
bagged_scores = sapply(x, function(x) mean(x > sample_cuts))
unbagged_cut = accuracy_threshold(x=x, group=group, pos_class=2)$cut
unbagged_scores = ifelse(x > unbagged_cut, 1, 0)
# Compare AUC:
PRROC::roc.curve(scores.class0 = bagged_scores,weights.class0 = ifelse(group==2,1,0))$auc
PRROC::roc.curve(scores.class0 = unbagged_scores,weights.class0 = ifelse(group==2,1,0))$auc
bagged_prediction = ifelse(bagged_scores > 0.50, 2, 0)
unbagged_prediction = ifelse(x > unbagged_cut, 2, 0)
# Compare Confusion Matrix:
table(bagged_prediction, group)
table(unbagged_prediction, group)