Description

Estimate the Kullback-Leibler divergence between two sets of random variates.
Arguments

x1: A numeric vector of random variates from the first distribution.

x2: A numeric vector of random variates from the second distribution.

nbreaks: A single value giving the number of breaks used to bin the variates.

minx: A single value giving the minimum of the binning range.

maxx: A single value giving the maximum of the binning range.

small: A small number added to histogram counts to prevent division by zero.
Details

The Kullback-Leibler divergence is approximated by binning the random variates and calculating the KL divergence of the resulting discrete distributions. It is recommended to visually check the distribution fits, particularly if the number of random variates is small. In general this method will be inaccurate if it is run on too few samples, e.g. fewer than 10 000; more than 100 000 is ideal.
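The discrete approximation described above can be sketched as follows. This is an illustrative Python implementation of the binning technique, not the package's own code; the function name, argument defaults, and binning details are assumptions.

```python
import numpy as np

def kldiv_binned(x1, x2, nbreaks=25, small=1e-10):
    """Approximate KL divergence D(P1 || P2) by binning two samples
    on a shared grid. Illustrative sketch only; names and defaults
    are not those of the package."""
    # Common binning range covering both samples
    lo = min(x1.min(), x2.min())
    hi = max(x1.max(), x2.max())
    breaks = np.linspace(lo, hi, nbreaks + 1)
    # Histogram counts; `small` prevents division by zero and log(0)
    p, _ = np.histogram(x1, bins=breaks)
    q, _ = np.histogram(x2, bins=breaks)
    p = (p + small) / (p + small).sum()
    q = (q + small) / (q + small).sum()
    # Discrete KL divergence: sum over bins of p * log(p / q)
    return float(np.sum(p * np.log(p / q)))

rng = np.random.default_rng(42)
a = rng.normal(0, 1, 100_000)
b = rng.normal(0, 1, 100_000)
c = rng.normal(1, 1, 100_000)
print(kldiv_binned(a, b))  # near zero: same underlying distribution
print(kldiv_binned(a, c))  # clearly positive: shifted distribution
```

Note how the estimate degrades with small samples, which is why the Details above recommend large sample sizes and a visual check of the distribution fits.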
Value

A helldist object containing approximate Hellinger distances and fitted density kernels.

hdist_disc: Estimate of the Hellinger distance using a discrete approximation of the distributions.

hdist_cont: Estimate of the Hellinger distance using a continuous approximation of the distributions.
Author(s)

Christopher J. Brown christo.j.brown@gmail.com