acc: Accuracy (acc) is the probability of a correct decision.


Description

acc defines overall accuracy as the probability of correspondence between a positive decision and true condition (i.e., the proportion of correct classification decisions or of dec_cor cases).
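As a minimal sketch of this definition in frequency terms (assuming riskyr's conventional frequency names hi, mi, fa, and cr for hits, misses, false alarms, and correct rejections; the values below are made up for illustration):

```r
# Hypothetical 2x2 frequencies (assumed values, not computed by the package):
hi <- 40  # hits: true positives
mi <- 10  # misses: false negatives
fa <- 20  # false alarms: false positives
cr <- 30  # correct rejections: true negatives

N       <- hi + mi + fa + cr  # population size
dec_cor <- hi + cr            # correct decisions (positive or negative)
acc     <- dec_cor / N        # accuracy as proportion of correct decisions
acc                           # 0.7
```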

Usage

acc

Format

An object of class numeric of length 1.

Details

Importantly, correct decisions dec_cor are not necessarily positive decisions dec_pos: a decision is correct whenever it matches the true condition, regardless of whether it is positive or negative.

For ways of understanding or obtaining the accuracy metric acc, as well as other accuracy metrics and several possible interpretations of accuracy, see accu.
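In probability terms, accuracy is the prevalence-weighted average of sensitivity and specificity. The sketch below illustrates that standard relation directly in base R (the values are made up for illustration; this is not a call to the package's own comp_acc function):

```r
prev <- .50  # prevalence: probability that the condition is TRUE
sens <- .80  # sensitivity: probability of a positive decision given a TRUE condition
spec <- .60  # specificity: probability of a negative decision given a FALSE condition

# Accuracy: average of sens and spec, weighted by prevalence:
acc <- prev * sens + (1 - prev) * spec
acc  # 0.7
```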

References

See the Wikipedia entry Accuracy_and_precision for additional information.

See Also

comp_acc computes accuracy from probabilities; accu lists all accuracy metrics; comp_accu_prob computes exact accuracy metrics from probabilities; comp_accu_freq computes accuracy metrics from frequencies; comp_sens and comp_PPV compute related probabilities; is_extreme_prob_set verifies extreme cases; comp_complement computes a probability's complement; is_complement verifies probability complements; comp_prob computes current probability information; prob contains current probability information; is_prob verifies probabilities.

Other probabilities: FDR, FOR, NPV, PPV, err, fart, mirt, ppod, prev, sens, spec

Other metrics: accu, comp_accu_freq, comp_accu_prob, comp_acc, comp_err, err

Examples

acc <- .50     # sets a rate of correct decisions of 50%
acc <- 50/100  # 50 correct decisions (dec_cor) out of 100 individuals
is_prob(acc)   # TRUE

riskyr documentation built on Jan. 3, 2019, 1:06 a.m.