Description
Functions for computing information measures of a single distribution of values and between pairs of distributions of values.
Warning!

Functions check whether .data is a probability distribution, i.e. whether its values sum to 1. To force normalisation, or to prevent it, set .do.norm to TRUE (do normalisation) or FALSE (don't do normalisation). For js.div and kl.div the two input vectors must have equal length.
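The behaviour described above can be pictured with a short sketch (an illustration only, not the package's actual code; the helper name and tolerance below are hypothetical):

    # Hypothetical helper illustrating the check: treat a vector as a
    # distribution if it sums to 1, otherwise normalise it.
    as_distribution <- function(x, tol = 1e-8) {
      if (abs(sum(x) - 1) > tol) x <- x / sum(x)
      x
    }

    p <- as_distribution(c(10, 30, 60))   # counts -> 0.1 0.3 0.6
    q <- as_distribution(c(25, 25, 50))   # must match length(p) for kl.div / js.div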
Functions:
- Shannon entropy quantifies the uncertainty (degree of surprise) associated with the value of a random variable drawn from the distribution.
- Kullback-Leibler divergence (information gain, information divergence, relative entropy, KLIC) is a non-symmetric measure of the difference between two probability distributions P and Q: the information lost when Q is used to approximate P.
- Jensen-Shannon divergence is a symmetric version of the KL divergence. Its square root is a metric often referred to as the Jensen-Shannon distance.
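For reference, the textbook formulas behind these three measures can be written in a few lines of R (a minimal sketch of the definitions, not the package's implementation; the logarithm base used by the package is not stated here, so the natural logarithm is assumed):

    # Shannon entropy: H(P) = -sum(p * log(p))
    shannon_entropy <- function(p) -sum(p * log(p))

    # Kullback-Leibler divergence: D(P || Q) = sum(p * log(p / q)), non-symmetric
    kl_divergence <- function(p, q) sum(p * log(p / q))

    # Jensen-Shannon divergence: average KL divergence to the mixture M = (P + Q) / 2
    js_divergence <- function(p, q) {
      m <- (p + q) / 2
      0.5 * kl_divergence(p, m) + 0.5 * kl_divergence(q, m)
    }

    p <- c(0.1, 0.4, 0.5)
    q <- c(0.3, 0.3, 0.4)
    shannon_entropy(p)          # uncertainty of P
    kl_divergence(p, q)         # information lost when Q approximates P
    sqrt(js_divergence(p, q))   # Jensen-Shannon distance (a metric)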
Usage
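The usage block did not render on this page, so the calls below are only a hypothetical pattern pieced together from the argument names documented in the Arguments section; the argument pairing and default values are assumptions, and the package providing entropy, kl.div and js.div must be loaded:

    p <- c(0.1, 0.4, 0.5)
    q <- c(0.3, 0.3, 0.4)

    # Hypothetical calls -- only the argument names come from this page.
    h  <- entropy(.data = p, .norm = FALSE, .do.norm = NA)
    kl <- kl.div(.alpha = p, .beta = q, .do.norm = NA)
    js <- js.div(.alpha = p, .beta = q, .do.norm = NA, .norm.entropy = FALSE)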
Arguments

.data, .alpha, .beta
    Vector of values.

.norm
    If TRUE, compute the normalised entropy (H / Hmax).

.do.norm
    One of NA, TRUE or FALSE. If NA, check whether the input is a distribution (sums to 1) and normalise it if it is not; TRUE forces normalisation, FALSE skips it.

.laplace
    Value of the Laplace correction added to every value in .data.

.norm.entropy
    If TRUE, normalise the JS divergence by the entropy.
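The .laplace correction matters mainly when a vector contains zeros: log(p / q) is infinite wherever q is zero, so adding a small constant to every value keeps the divergences finite. A hedged illustration of the idea (whether the package re-normalises after adding the correction is an assumption):

    p <- c(0.5, 0.5, 0.0)
    q <- c(0.9, 0.0, 0.1)

    # Raw KL divergence is undefined here because p / q contains Inf.
    laplace <- 1e-12
    p_s <- (p + laplace) / sum(p + laplace)   # add correction, then re-normalise
    q_s <- (q + laplace) / sum(q + laplace)
    sum(p_s * log(p_s / q_s))                 # finite KL divergence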
Value

Shannon entropy, Jensen-Shannon divergence or Kullback-Leibler divergence values.
See Also

similarity, diversity