Numeric vectors representing probabilities
Kullback-Leibler divergence is a non-symmetric measure of difference between two probability vectors. In general, KL(x, y) is not equal to KL(y, x).
Because this measure is defined for probabilities, the vectors x and y are normalized in the function so they sum to 1.
The Kullback-Leibler divergence between x and y is defined as

KL(x, y) = \sum_i x_i \log (x_i / y_i).

We adopt the following conventions if elements of x or y are zero: 0 \log (0 / y_i) = 0, 0 \log (0 / 0) = 0, and x_i \log (x_i / 0) = ∞. As a result, elements of x that are zero do not contribute to the sum. If an element of y is zero where the corresponding element of x is nonzero, the result will be Inf. If either x or y sums to zero, we are not able to compute the proportions, and we return NaN.
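The conventions above can be sketched as follows. This is an illustrative Python implementation, not the package's actual code; the function name `kl_divergence` is an assumption for the example.

```python
import numpy as np

def kl_divergence(x, y):
    """Sketch of KL(x, y) with the zero conventions described above."""
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    # If either vector sums to zero, the proportions are undefined: NaN.
    if x.sum() == 0 or y.sum() == 0:
        return float("nan")
    # Normalize so each vector sums to 1.
    x = x / x.sum()
    y = y / y.sum()
    total = 0.0
    for xi, yi in zip(x, y):
        if xi == 0:
            continue             # 0 * log(0 / y_i) = 0 by convention
        if yi == 0:
            return float("inf")  # x_i * log(x_i / 0) = Inf
        total += xi * np.log(xi / yi)
    return total
```

Note that the asymmetry mentioned above is easy to observe: `kl_divergence([1, 3], [2, 2])` and `kl_divergence([2, 2], [1, 3])` generally differ.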