View source: R/KNN.information.measures.R
| KL.divergence | R Documentation |
Description

Compute Kullback-Leibler divergence.
Usage

KL.divergence(X, Y, k = 10, algorithm = c("kd_tree", "cover_tree", "brute"))
KLx.divergence(X, Y, k = 10, algorithm = "kd_tree")
Arguments

| Argument | Description |
| --- | --- |
| X | An input data matrix. |
| Y | An input data matrix. |
| k | The maximum number of nearest neighbors to search. The default value is set to 10. |
| algorithm | Nearest neighbor search algorithm. |
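
As a usage illustration (the data below are arbitrary and the algorithm choice is just one of the documented options), a multivariate call could look like this:

## Illustrative only: two samples of 2-D points with different spread.
set.seed(1)
X <- matrix(rnorm(2000), ncol = 2)          # 1000 draws from N(0, I)
Y <- matrix(rnorm(2000, sd = 2), ncol = 2)  # 1000 draws from N(0, 4 I)
KL.divergence(X, Y, k = 5, algorithm = "brute")  # kNN-based estimate of the divergence from X to Y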
Details

If p(x) and q(x) are two continuous probability density functions, then the Kullback-Leibler divergence of q from p is defined as E_p[\log \frac{p(x)}{q(x)}].

The KL.* versions compute the divergences in C code and return them to R; the KLx.* versions do not.
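
For intuition, the kind of k-th nearest neighbor estimator described by Boltz et al. can be sketched at the R level as follows. This is an illustrative sketch, not the package's internal implementation: kl_knn_sketch is a hypothetical helper, and it assumes the companion functions knn.dist and knnx.dist from the FNN package are available and that X and Y are numeric matrices with the same number of columns.

## Hypothetical sketch of a kNN-based KL divergence estimate, in the spirit of
## Boltz et al. (2007); not the package's exact C implementation.
kl_knn_sketch <- function(X, Y, k = 10) {
  X <- as.matrix(X); Y <- as.matrix(Y)
  n <- nrow(X); m <- nrow(Y); d <- ncol(X)
  rho <- FNN::knn.dist(X, k = k)[, k]      # distance from each x_i to its k-th NN within X
  nu  <- FNN::knnx.dist(Y, X, k = k)[, k]  # distance from each x_i to its k-th NN in Y
  d * mean(log(nu / rho)) + log(m / (n - 1))
}

The log(m/(n - 1)) term in the sketch corrects for the differing sample sizes of X and Y.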
Value

Return the Kullback-Leibler divergence from X to Y.
Author(s)

Shengqiao Li. To report any bugs or suggestions please email: lishengqiao@yahoo.com
References

S. Boltz, E. Debreuve and M. Barlaud (2007). “kNN-based high-dimensional Kullback-Leibler distance for tracking”. Image Analysis for Multimedia Interactive Services, 2007 (WIAMIS '07), Eighth International Workshop on.

S. Boltz, E. Debreuve and M. Barlaud (2009). “High-dimensional statistical measure for region-of-interest tracking”. IEEE Transactions on Image Processing, 18(6), 1266–1283.
See Also

KL.dist
Examples

set.seed(1000)
X <- rexp(10000, rate = 0.2)
Y <- rexp(10000, rate = 0.4)
KL.divergence(X, Y, k = 5)
# theoretical divergence = log(0.2/0.4) + (0.4/0.2) - 1 = 1 - log(2) = 0.307
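
The divergence is not symmetric; under the same setup, the reverse direction has a different theoretical value, and the KLx variant documented above should give a comparable estimate:

KL.divergence(Y, X, k = 5)
# theoretical divergence = log(0.4/0.2) + (0.2/0.4) - 1 = log(2) - 0.5 = 0.193
KLx.divergence(X, Y, k = 5)  # should be close to KL.divergence(X, Y, k = 5) above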