Description
Provides an estimate of the Kullback-Leibler divergence between two probability distributions, i.e. the difference between the cross-entropy of the two distributions and the individual entropy of the first: D(P || Q) = H(P, Q) - H(P).
Usage

kld(x, y, bins)
Arguments

x, y    numeric or discrete data vectors
bins    the number of bins to use
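The bins argument suggests a histogram-based estimator: both vectors are discretised on a common set of bins before the divergence is computed. The sketch below is a rough, hypothetical illustration of that idea only; the name kld_sketch, the equal-width binning, and the handling of empty bins are assumptions for illustration, not the package's actual implementation.

# Hypothetical binned KL estimate: discretise x and y on shared
# equal-width breaks, convert counts to probabilities, and sum p * log(p / q).
kld_sketch <- function(x, y, bins) {
  breaks <- seq(min(c(x, y)), max(c(x, y)), length.out = bins + 1)
  p <- hist(x, breaks = breaks, plot = FALSE)$counts
  q <- hist(y, breaks = breaks, plot = FALSE)$counts
  p <- p / sum(p)
  q <- q / sum(q)
  keep <- p > 0 & q > 0   # drop bins empty in either sample to avoid log(0)
  sum(p[keep] * log(p[keep] / q[keep]))
}

# Example call with two arbitrary numeric vectors:
# kld_sketch(rnorm(1000), rnorm(1000, mean = 0.5), bins = 10)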
Examples
[1] 0.0746387
[1] 0.2075187