KL.divergence: Kullback-Leibler Divergence

Description

Compute Kullback-Leibler divergence.

Usage

  KL.divergence(X, Y, k = 10, algorithm=c("kd_tree", "cover_tree", "brute"))
  KLx.divergence(X, Y, k = 10, algorithm="kd_tree")

Arguments

X

An input data matrix.

Y

An input data matrix.

k

The maximum number of nearest neighbors to search. The default value is set to 10.

algorithm

Nearest neighbor search algorithm: one of "kd_tree" (default), "cover_tree", or "brute".

Details

If p(x) and q(x) are two continuous probability density functions, then the Kullback-Leibler divergence of q from p is defined as E_p[log(p(x)/q(x))].
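
For illustration (not part of the package documentation), the definition can be checked by Monte Carlo for two densities known in closed form, here the exponential densities used in the Examples below; the exact value for these rates is 1 - log(2):

    ## Monte Carlo sketch of E_p[log(p(x)/q(x))] for p = Exp(rate 0.2), q = Exp(rate 0.4).
    ## This only illustrates the definition; it is not the kNN estimator used by the package.
    set.seed(1)
    x <- rexp(1e6, rate = 0.2)                                  # samples from p
    mean(dexp(x, 0.2, log = TRUE) - dexp(x, 0.4, log = TRUE))   # sample average of log(p/q)
    # closed form: log(0.2/0.4) + 0.4/0.2 - 1 = 1 - log(2), approx. 0.307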

The KL.* versions return divergences from C code to R, but the KLx.* versions do not.
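
As a hedged sketch (assuming both interfaces accept the arguments shown in Usage and estimate the same quantity), the two can be compared on the same data:

    ## Run both interfaces on identical input; they are expected to agree closely.
    set.seed(2)
    X <- rexp(5000, rate = 0.2)
    Y <- rexp(5000, rate = 0.4)
    KL.divergence(X, Y, k = 5)
    KLx.divergence(X, Y, k = 5)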

Value

The Kullback-Leibler divergence from X to Y.
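
A brief inspection sketch, under the assumption that one estimate is reported for each neighbor order 1 through k (so the result has length k):

    ## Assumes the result is a numeric vector with one estimate per neighbor order.
    set.seed(3)
    X <- rexp(10000, rate = 0.2)
    Y <- rexp(10000, rate = 0.4)
    d <- KL.divergence(X, Y, k = 5)
    length(d)   # expected to be 5 under the assumption above
    mean(d)     # a simple pooled estimate across neighbor orders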

Author(s)

Shengqiao Li. To report any bugs or suggestions please email: shli@stat.wvu.edu.

References

S. Boltz, E. Debreuve and M. Barlaud (2007). “kNN-based high-dimensional Kullback-Leibler distance for tracking”. In Eighth International Workshop on Image Analysis for Multimedia Interactive Services (WIAMIS '07).

S. Boltz, E. Debreuve and M. Barlaud (2009). “High-dimensional statistical measure for region-of-interest tracking”. IEEE Transactions on Image Processing, 18(6), 1266–1283.

See Also

KL.dist

Examples

    set.seed(1000)
    X <- rexp(10000, rate = 0.2)
    Y <- rexp(10000, rate = 0.4)

    KL.divergence(X, Y, k = 5)
    # theoretical divergence = log(0.2/0.4) + (0.4/0.2 - 1) = 1 - log(2) = 0.307
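
A further illustrative example (not from the package documentation), assuming X and Y may also be matrices with the same number of columns, as the Arguments section indicates; the algorithm argument selects the neighbor-search backend:

    ## Hypothetical multivariate sketch: two bivariate normal samples with identity covariance.
    set.seed(1000)
    X <- matrix(rnorm(20000), ncol = 2)              # 10000 draws from N((0,0), I)
    Y <- matrix(rnorm(20000, mean = 1), ncol = 2)    # 10000 draws from N((1,1), I)

    KL.divergence(X, Y, k = 5, algorithm = "kd_tree")
    # theoretical divergence = 0.5 * ((1-0)^2 + (1-0)^2) = 1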

