kullback_leibler_divergence: Kullback-Leibler divergence


Description

Kullback-Leibler divergence

Usage

kullback_leibler_divergence(x, y)

Arguments

x, y

Numeric vectors representing probabilities

Details

Kullback-Leibler divergence is a non-symmetric measure of difference between two probability vectors. In general, KL(x, y) is not equal to KL(y, x).

Because this measure is defined for probabilities, the vectors x and y are normalized in the function so they sum to 1.
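For illustration, here is a minimal base-R sketch of the computation as described above; the helper kl() is hypothetical and is not the exported package function.

kl <- function(x, y) {
  # Normalize each vector to proportions, then sum p_i * log(p_i / q_i).
  p <- x / sum(x)
  q <- y / sum(y)
  keep <- p > 0                      # terms with p_i = 0 contribute nothing
  sum(p[keep] * log(p[keep] / q[keep]))
}

kl(c(5, 3, 2), c(2, 5, 3))   # not the same as ...
kl(c(2, 5, 3), c(5, 3, 2))   # ... the reverse: the measure is non-symmetric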

Value

The Kullback-Leibler divergence between x and y. We adopt the following conventions if elements of x or y are zero: 0 \log(0 / y_i) = 0, 0 \log(0 / 0) = 0, and x_i \log(x_i / 0) = \infty. As a result, elements of x that are zero do not contribute to the sum. If an element of y is zero where the corresponding element of x is nonzero, the result is Inf. If either x or y sums to zero, the proportions cannot be computed, and the function returns NaN.
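Assuming the abdiv package is loaded, these conventions can be observed directly; the specific input vectors below are illustrative only.

library(abdiv)
kullback_leibler_divergence(c(0, 2, 3), c(1, 2, 2))   # zero in x: that term is dropped
kullback_leibler_divergence(c(1, 2, 3), c(0, 2, 2))   # zero in y where x > 0: Inf
kullback_leibler_divergence(c(0, 0, 0), c(1, 2, 2))   # x sums to zero: NaN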

