acc: Accuracy (acc) is the probability of a correct decision.

Description

acc defines overall accuracy as the probability of correspondence between decisions and true conditions (i.e., the proportion of correct classification decisions, or of dec_cor cases).

Usage

acc

Format

An object of class numeric of length 1.

Details

Importantly, correct decisions dec_cor are not necessarily positive decisions dec_pos.

Understanding or obtaining the accuracy metric acc:

  • Definition: acc is the (non-conditional) probability:

    acc = p(dec_cor) = dec_cor/N

    or the base rate (or baseline probability) of a decision being correct, but not necessarily positive.

    acc values range from 0 (no correct decision/prediction) to 1 (perfect decision/prediction).

  • Computation: acc can be computed in several ways (see the code sketch after this list):

    (a) from prob: acc = (prev x sens) + [(1 - prev) x spec]

    (b) from freq: acc = dec_cor/N = (hi + cr)/(hi + mi + fa + cr)

    (c) as complement of the error rate err: acc = 1 - err

    When frequencies in freq are not rounded, (b) coincides with (a) and (c).

  • Perspective: acc classifies a population of N individuals by accuracy/correspondence (acc = dec_cor/N).

    acc is the "by accuracy" or "by correspondence" counterpart to prev (which adopts a "by condition" perspective) and to ppod (which adopts a "by decision" perspective).

  • Alternative names: base rate of correct decisions, non-erroneous cases

  • In terms of frequencies, acc is the ratio of dec_cor (i.e., hi + cr) divided by N (i.e., hi + mi + fa + cr):

    acc = dec_cor/N = (hi + cr)/(hi + mi + fa + cr)

  • Dependencies: acc is a feature of both the environment (true condition) and the decision process or diagnostic procedure. It reflects the correspondence of decisions to conditions.
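
The following sketch illustrates the three ways of computing acc listed above. All input values (prev, sens, spec, N) are assumed example values, and only base R is used (no riskyr functions):

prev <- 0.25   # assumed prevalence
sens <- 0.80   # assumed sensitivity
spec <- 0.60   # assumed specificity
N    <- 1000   # assumed population size

# (a) from probabilities:
acc_prob <- (prev * sens) + ((1 - prev) * spec)

# (b) from (unrounded) frequencies:
hi <- N * prev * sens              # hits
mi <- N * prev * (1 - sens)        # misses
fa <- N * (1 - prev) * (1 - spec)  # false alarms
cr <- N * (1 - prev) * spec        # correct rejections
acc_freq <- (hi + cr) / (hi + mi + fa + cr)  # dec_cor/N

# (c) as complement of the error rate err:
acc_err <- 1 - ((mi + fa) / N)

c(acc_prob, acc_freq, acc_err)  # all equal 0.65, as the frequencies are not rounded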

See accu for other accuracy metrics and several possible interpretations of accuracy.

References

Consult Wikipedia:Accuracy_and_precision for additional information.

See Also

comp_acc computes accuracy from probabilities; accu lists all accuracy metrics; comp_accu_prob computes exact accuracy metrics from probabilities; comp_accu_freq computes accuracy metrics from frequencies; comp_sens and comp_PPV compute related probabilities; is_extreme_prob_set verifies extreme cases; comp_complement computes a probability's complement; is_complement verifies probability complements; comp_prob computes current probability information; prob contains current probability information; is_prob verifies probabilities.

Other probabilities: FDR, FOR, NPV, PPV, err, fart, mirt, ppod, prev, sens, spec

Other metrics: accu, comp_accu_freq(), comp_accu_prob(), comp_acc(), comp_err(), err

Examples

acc <- .50     # a rate of correct decisions of 50%
acc <- 50/100  # 50 correct decisions (dec_cor) among N = 100 individuals
is_prob(acc)   # TRUE
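
A further sketch obtains acc from frequencies and verifies that the result is a probability; the frequency values are assumed for illustration:

hi <- 30; mi <- 25; fa <- 25; cr <- 20   # assumed example frequencies
acc <- (hi + cr) / (hi + mi + fa + cr)   # dec_cor/N = 50/100 = 0.50
is_prob(acc)                             # TRUE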

