entropy: Information measures

Description Usage Arguments Value Examples

View source: R/info_theory.R

Description

Compute information-based estimates and distances.
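For reference, these are the standard information measures. With logarithm base b (the .base argument), P = (p_1, ..., p_n) and Q = (q_1, ..., q_n) probability distributions:

  Entropy:        H(P)      = -sum_i p_i * log_b(p_i)
  KL divergence:  KL(P||Q)  =  sum_i p_i * log_b(p_i / q_i)
  JS divergence:  JS(P, Q)  =  (1/2) KL(P||M) + (1/2) KL(Q||M),  where M = (P + Q) / 2
  Cross-entropy:  H(P, Q)   = -sum_i p_i * log_b(q_i)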

Usage

entropy(.data, .base = 2, .norm = FALSE, .do.norm = NA, .laplace = 1e-12)

kl_div(.alpha, .beta, .base = 2, .do.norm = NA, .laplace = 1e-12)

js_div(.alpha, .beta, .base = 2, .do.norm = NA, .laplace = 1e-12, .norm.entropy = FALSE)

cross_entropy(.alpha, .beta, .base = 2, .do.norm = NA,
              .laplace = 1e-12, .norm.entropy = FALSE)

Arguments

.data

Numeric vector of non-negative values. A distribution, given either as probabilities or as raw counts.

.base

Numeric. The base of the logarithm. The default of 2 gives results in bits.

.norm

Logical. If TRUE, normalise the entropy by its maximal value (the entropy of the uniform distribution), so the result lies in [0, 1].

.do.norm

Logical. If TRUE, normalise the input distributions so that they sum to 1.

.laplace

Numeric. The value added to each element for the Laplace correction (additive smoothing), which avoids taking the logarithm of zero.

.alpha

Numeric vector. A distribution of a random variable.

.beta

Numeric vector. A distribution of a random variable.

.norm.entropy

Logical. If TRUE, normalise the result by the average entropy of the input distributions.

Value

A numeric value.

Examples

set.seed(42)

# Two random non-negative vectors, treated as unnormalised distributions
P <- abs(rnorm(10))
Q <- abs(rnorm(10))

entropy(P)
kl_div(P, Q)
js_div(P, Q)
cross_entropy(P, Q)
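The computations these functions perform can be sketched as follows. This is a minimal Python reference implementation of the standard definitions with Laplace smoothing and input normalisation, written to clarify the formulas; it is not the package's actual R code, and the helper _normalise is illustrative:

```python
import math

def _normalise(p, laplace=1e-12):
    # Laplace correction: add a small constant to every element,
    # then rescale so the vector sums to 1.
    p = [x + laplace for x in p]
    s = sum(p)
    return [x / s for x in p]

def entropy(p, base=2, laplace=1e-12):
    # Shannon entropy: -sum p_i * log_b(p_i)
    p = _normalise(p, laplace)
    return -sum(x * math.log(x, base) for x in p)

def kl_div(p, q, base=2, laplace=1e-12):
    # Kullback-Leibler divergence: sum p_i * log_b(p_i / q_i)
    p, q = _normalise(p, laplace), _normalise(q, laplace)
    return sum(x * math.log(x / y, base) for x, y in zip(p, q))

def js_div(p, q, base=2, laplace=1e-12):
    # Jensen-Shannon divergence: average KL divergence to the
    # midpoint distribution M = (P + Q) / 2; symmetric in p and q.
    p, q = _normalise(p, laplace), _normalise(q, laplace)
    m = [(x + y) / 2 for x, y in zip(p, q)]
    return 0.5 * kl_div(p, m, base, 0) + 0.5 * kl_div(q, m, base, 0)

def cross_entropy(p, q, base=2, laplace=1e-12):
    # Cross-entropy: -sum p_i * log_b(q_i); equals entropy(p) when q == p.
    p, q = _normalise(p, laplace), _normalise(q, laplace)
    return -sum(x * math.log(y, base) for x, y in zip(p, q))
```

Note how the Laplace correction is applied before normalisation, so zero counts contribute a vanishingly small probability instead of producing log(0).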

abrown435/immunarch-test documentation built on July 29, 2020, 12:04 a.m.