entropy: Information measures.

Description

Functions for information measures of and between distributions of values.

Warning! By default the functions check whether .data is a distribution of a random variable (i.e., sums to 1). To force normalisation, or to skip the check, set .do.norm to TRUE (always normalise) or FALSE (never normalise). For js.div and kl.div the input vectors must have equal length.

Functions:

- Shannon entropy quantifies the uncertainty (degree of surprise) associated with predicting the value of a random variable drawn from a given distribution.

- Kullback-Leibler divergence (information gain, information divergence, relative entropy, KLIC) is a non-symmetric measure of the difference between two probability distributions P and Q (measure of information lost when Q is used to approximate P).

- Jensen-Shannon divergence is a symmetric version of the KLIC. Its square root is a metric, often referred to as the Jensen-Shannon distance.
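
A minimal sketch of the standard definitions these measures follow (illustration only, using the natural logarithm; the package's actual choice of logarithm base and its Laplace correction are not reproduced here):

# Textbook definitions for illustration; not the package implementation.
shannon <- function(p) -sum(p * log(p))        # Shannon entropy H(P)
kl      <- function(p, q) sum(p * log(p / q))  # KL(P || Q), non-symmetric
js      <- function(p, q) {                    # JS(P, Q), symmetric
  m <- (p + q) / 2                             # mixture distribution M
  0.5 * kl(p, m) + 0.5 * kl(q, m)
}

p <- c(0.5, 0.3, 0.2)
q <- c(0.4, 0.4, 0.2)
shannon(p)                                     # uncertainty of p
kl(p, q); kl(q, p)                             # values differ: KL is not symmetric
js(p, q); js(q, p)                             # values coincide: JS is symmetric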

Usage

entropy(.data, .norm = F, .do.norm = NA, .laplace = 1e-12)

kl.div(.alpha, .beta, .do.norm = NA, .laplace = 1e-12)

js.div(.alpha, .beta, .do.norm = NA, .laplace = 1e-12, .norm.entropy = F)

Arguments

.data, .alpha, .beta

Vector of values.

.norm

If T then compute the normalised entropy (H / Hmax).

.do.norm

One of three values: NA, T or F. If NA, check whether the input is a distribution (sum(.data) == 1) and normalise it if needed, applying the given Laplace correction value. If T, always normalise and apply the Laplace correction. If F, do neither (see the sketch at the end of this section).

.laplace

Value for the Laplace correction, which is added to every element of .data.

.norm.entropy

If T then normalise the JS divergence by entropy.
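
For example, a count vector can be handed over for normalisation by the function itself; a sketch (return values omitted, they depend on the Laplace correction):

counts <- c(10, 5, 5)
entropy(counts, .do.norm = T)                # force normalisation of the raw counts
entropy(counts / sum(counts), .do.norm = F)  # already a distribution: skip the check and correction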

Value

Shannon entropy, Jensen-Shannon divergence or Kullback-Leibler divergence values.
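
A short usage sketch with illustrative vectors (for kl.div and js.div both vectors must have equal length):

P <- c(0.2, 0.3, 0.5)
Q <- c(0.3, 0.3, 0.4)
entropy(P)             # Shannon entropy of P
entropy(P, .norm = T)  # normalised entropy H / Hmax
kl.div(P, Q)           # information lost when Q is used to approximate P
js.div(P, Q)           # symmetric divergence between P and Q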

See Also

similarity, diversity

