Description:

Computes the Kullback-Leibler divergence KLD(p||q) between two discrete probability distributions.

Usage:

KLD(p, q)

Arguments:

p: first probability distribution ("to which"), a numeric vector, no NAs

q: second probability distribution ("from which"), a numeric vector of the same length as p

Details:

Assumes that both p and q are normalized (sum(p) = sum(q) = 1). Note that by definition the Kullback-Leibler divergence is asymmetric: KLD(p||q) is not equal to KLD(q||p).

Value:

KLD(p||q), a numeric value.
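For illustration, a minimal Python sketch of the quantity this function computes, KLD(p||q) = sum_i p_i * log(p_i / q_i), assuming both inputs are normalized and of equal length (the function name `kld` and the example vectors are hypothetical, not part of the package):

```python
import math

def kld(p, q):
    """Kullback-Leibler divergence KLD(p||q) for discrete distributions.

    Assumes p and q are normalized (each sums to 1) and have the same
    length. Terms with p_i == 0 contribute 0 by convention.
    """
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

p = [0.1, 0.4, 0.5]
q = [0.3, 0.3, 0.4]

# The divergence is asymmetric: kld(p, q) generally differs from kld(q, p),
# and kld(p, p) is exactly 0.
print(kld(p, q))
print(kld(q, p))
```

Note the asymmetry demonstrated above: swapping the arguments changes the result, which is why the two roles are labeled "to which" (p) and "from which" (q).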