View source: R/comp_prob_prob.R

comp_acc | R Documentation

`comp_acc` computes overall accuracy `acc` from 3 essential probabilities `prev`, `sens`, and `spec`.

comp_acc(prev, sens, spec)

`prev` | The condition's prevalence |

`sens` | The decision's sensitivity |

`spec` | The decision's specificity |

`comp_acc` uses probabilities (not frequencies) as inputs and returns an exact probability (proportion) without rounding.

Understanding the probability `acc`:

Definition: `acc` is the (non-conditional) probability `acc = p(dec_cor) = dec_cor/N`, i.e., the base rate (or baseline probability) of a decision being correct, but not necessarily positive. `acc` values range from 0 (no correct decision/prediction) to 1 (perfect decision/prediction).

Computation: `acc` can be computed in 2 ways:

(a) from `prob`: `acc = (prev x sens) + [(1 - prev) x spec]`

(b) from `freq`: `acc = dec_cor/N = (hi + cr)/(hi + mi + fa + cr)`

When frequencies in `freq` are not rounded, (b) coincides with (a).

Perspective: `acc` classifies a population of `N` individuals by accuracy/correspondence (`acc = dec_cor/N`). `acc` is the "by accuracy" or "by correspondence" counterpart to `prev` (which adopts a "by condition" perspective) and to `ppod` (which adopts a "by decision" perspective).

Alternative names of `acc`: base rate of correct decisions, non-erroneous cases.

In terms of frequencies, `acc` is the ratio of `dec_cor` (i.e., `hi + cr`) divided by `N` (i.e., `hi + mi + fa + cr`): `acc = dec_cor/N = (hi + cr)/(hi + mi + fa + cr)`.

Dependencies: `acc` is a feature of both the environment (true condition) and of the decision process or diagnostic procedure. It reflects the correspondence of decisions to conditions.
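The two ways of computing `acc` can be sketched in a few lines of base R. The helper names `acc_from_prob` and `acc_from_freq` below are hypothetical (not functions of the package); the sketch only restates the two formulas above:

```r
# Sketch of computation (a): acc from the probabilities prev, sens, spec.
acc_from_prob <- function(prev, sens, spec) {
  (prev * sens) + ((1 - prev) * spec)
}

# Sketch of computation (b): acc from the 4 frequencies hi, mi, fa, cr.
acc_from_freq <- function(hi, mi, fa, cr) {
  (hi + cr) / (hi + mi + fa + cr)
}

# With unrounded frequencies, (b) coincides with (a):
N    <- 1000
prev <- .10; sens <- .200; spec <- .300
hi <- N * prev * sens              # true condition, correctly detected
mi <- N * prev * (1 - sens)        # true condition, missed
fa <- N * (1 - prev) * (1 - spec)  # no condition, false alarm
cr <- N * (1 - prev) * spec        # no condition, correctly rejected

acc_from_prob(prev, sens, spec)  # => 0.29
acc_from_freq(hi, mi, fa, cr)    # => 0.29
```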

See `accu` for other accuracy metrics and several possible interpretations of accuracy.

Overall accuracy `acc` as a probability (proportion). A warning is provided for NaN values.
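The NaN warning can be illustrated with a minimal guard of this kind. This is a sketch, not the package's actual implementation, and `comp_acc_checked` is a hypothetical name:

```r
# Hypothetical wrapper sketch: compute acc and warn on NaN results
# (riskyr's real comp_acc may implement this differently).
comp_acc_checked <- function(prev, sens, spec) {
  acc <- (prev * sens) + ((1 - prev) * spec)
  if (any(is.nan(acc))) {
    warning("comp_acc yielded NaN values.")
  }
  acc
}

comp_acc_checked(.10, .200, .300)  # => 0.29
comp_acc_checked(NaN, .5, .5)      # => NaN, with a warning
```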

See `acc` for definition and `accu` for other accuracy metrics. `comp_accu_freq` and `comp_accu_prob` compute accuracy metrics from frequencies and probabilities.

`acc` defines accuracy as a probability;
`accu` lists all accuracy metrics;
`comp_accu_prob` computes exact accuracy metrics from probabilities;
`comp_accu_freq` computes accuracy metrics from frequencies;
`comp_sens` and `comp_PPV` compute related probabilities;
`is_extreme_prob_set` verifies extreme cases;
`comp_complement` computes a probability's complement;
`is_complement` verifies probability complements;
`comp_prob` computes current probability information;
`prob` contains current probability information;
`is_prob` verifies probabilities.

Other functions computing probabilities:
`comp_FDR()`, `comp_FOR()`, `comp_NPV()`, `comp_PPV()`, `comp_accu_freq()`, `comp_accu_prob()`, `comp_comp_pair()`, `comp_complement()`, `comp_complete_prob_set()`, `comp_err()`, `comp_fart()`, `comp_mirt()`, `comp_ppod()`, `comp_prob_freq()`, `comp_prob()`, `comp_sens()`, `comp_spec()`

Other metrics:
`accu`, `acc`, `comp_accu_freq()`, `comp_accu_prob()`, `comp_err()`, `err`
# ways to work:
comp_acc(.10, .200, .300)  # => acc = 0.29
comp_acc(.50, .333, .666)  # => acc = 0.4995

# watch out for vectors:
prev.range <- seq(0, 1, by = .1)
comp_acc(prev.range, .5, .5)  # => 0.5 0.5 0.5 0.5 0.5 0.5 0.5 0.5 0.5 0.5 0.5

# watch out for extreme values:
comp_acc(1, 1, 1)  # => 1
comp_acc(1, 1, 0)  # => 1
comp_acc(1, 0, 1)  # => 0
comp_acc(1, 0, 0)  # => 0
comp_acc(0, 1, 1)  # => 1
comp_acc(0, 1, 0)  # => 0
comp_acc(0, 0, 1)  # => 1
comp_acc(0, 0, 0)  # => 0
