Description

Kullback-Leibler divergence.
Usage

    KL(x, y)
Arguments

    x, y    Numeric vectors representing probabilities.
Details

Kullback-Leibler divergence is a non-symmetric measure of the difference between two probability vectors; in general, KL(x, y) is not equal to KL(y, x). Because this measure is defined for probabilities, the vectors x and y are normalized in the function so that each sums to 1.
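For concreteness, a minimal sketch of the computation described above (not the package's own implementation; the helper name kl_div is an assumption):

    # Normalize both vectors to proportions, then sum x_i * log(x_i / y_i).
    kl_div <- function(x, y) {
      x <- x / sum(x)                          # normalize x to sum to 1
      y <- y / sum(y)                          # normalize y to sum to 1
      sum(ifelse(x > 0, x * log(x / y), 0))    # 0 * log(0 / y_i) counts as 0
    }

    p <- c(1, 2, 3)
    q <- c(1, 1, 4)
    kl_div(p, q)   # ~0.0872
    kl_div(q, p)   # ~0.0763, differs from kl_div(p, q): the measure is non-symmetric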
Value

The Kullback-Leibler divergence between x and y. We adopt the following conventions if elements of x or y are zero: 0 \log(0 / y_i) = 0, 0 \log(0 / 0) = 0, and x_i \log(x_i / 0) = \infty. As a result, elements of x that are zero do not contribute to the sum. If any element of y is zero where the corresponding element of x is nonzero, the result is Inf. If either x or y sums to zero, the proportions cannot be computed and the function returns NaN.
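The following sketch illustrates how these conventions play out numerically (again an assumption-laden sketch, not the package's code; the helper name and the placement of the NaN guard are assumptions):

    # Handle the zero conventions described above.
    kl_div <- function(x, y) {
      if (sum(x) == 0 || sum(y) == 0) return(NaN)  # proportions undefined
      x <- x / sum(x)
      y <- y / sum(y)
      sum(ifelse(x > 0, x * log(x / y), 0))        # 0 log(0 / y_i) = 0 log(0 / 0) = 0
    }

    kl_div(c(0, 1, 1), c(1, 1, 1))   # zeros in x drop out of the sum (finite result)
    kl_div(c(1, 1, 1), c(0, 1, 1))   # x_i > 0 where y_i = 0, so the result is Inf
    kl_div(c(0, 0, 0), c(1, 1, 1))   # x sums to zero, so the result is NaN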