# KL.dist: Kullback-Leibler Divergence In FNN: Fast Nearest Neighbor Search Algorithms and Applications

## Description

Compute the symmetric Kullback-Leibler distance.

## Usage

```r
KL.dist(X, Y, k = 10, algorithm = c("kd_tree", "cover_tree", "brute"))
KLx.dist(X, Y, k = 10, algorithm = "kd_tree")
```

## Arguments

- `X`: An input data matrix.
- `Y`: An input data matrix.
- `k`: The maximum number of nearest neighbors to search. The default value is 10.
- `algorithm`: The nearest neighbor search algorithm.

## Details

The symmetric Kullback-Leibler distance is the sum of the divergence of `q(x)` from `p(x)` and the divergence of `p(x)` from `q(x)`.

The `KL.*` functions return the distances computed in C code back to R, but the `KLx.*` functions do not.
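In symbols, the symmetric distance described above can be written as follows (the standard textbook definition, not quoted from this package's source):

```latex
\mathrm{KL.dist}(p, q)
  = \mathrm{KL}(p \,\|\, q) + \mathrm{KL}(q \,\|\, p),
\qquad
\mathrm{KL}(p \,\|\, q) = \int p(x) \log \frac{p(x)}{q(x)} \, dx
```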

## Value

Return a vector of length `k`: the estimated Kullback-Leibler distance between `X` and `Y` for each neighborhood size from 1 to `k` (see the example output below, which shows five estimates for `k = 5`).

## Author(s)

Shengqiao Li. To report any bugs or suggestions please email: [email protected]

## References

S. Boltz, E. Debreuve and M. Barlaud (2007). “kNN-based high-dimensional Kullback-Leibler distance for tracking”. In Eighth International Workshop on Image Analysis for Multimedia Interactive Services (WIAMIS '07).

S. Boltz, E. Debreuve and M. Barlaud (2009). “High-dimensional statistical measure for region-of-interest tracking”. IEEE Transactions on Image Processing, 18(6), 1266–1283.

## See Also

`KL.divergence`.

## Examples

```r
set.seed(1000)
X <- rexp(10000, rate = 0.2)
Y <- rexp(10000, rate = 0.4)

KL.dist(X, Y, k = 5)
KLx.dist(X, Y, k = 5)
# theoretical distance = (0.2-0.4)^2/(0.2*0.4) = 0.5
```

### Example output

```
[1] 0.4843205 0.5201644 0.4883402 0.4861477 0.4919426
[1] 0.4843205 0.5201644 0.4883402 0.4861477 0.4919426
```
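The theoretical value of 0.5 quoted in the example's comment can be checked in closed form: for exponential densities, the directed divergence has a simple analytic expression, and the symmetric sum reduces to the ratio formula in the comment. A minimal base-R check (the helper `kl_exp` is ours for illustration, not part of FNN):

```r
# KL(Exp(r1) || Exp(r2)) = log(r1/r2) + r2/r1 - 1 for exponential
# densities with rates r1 and r2; the symmetric sum telescopes to
# (r1 - r2)^2 / (r1 * r2).
kl_exp <- function(r1, r2) log(r1 / r2) + r2 / r1 - 1

sym <- kl_exp(0.2, 0.4) + kl_exp(0.4, 0.2)
sym                              # 0.5
(0.2 - 0.4)^2 / (0.2 * 0.4)      # same value, 0.5
```

The kNN estimates above hover around this exact value, with the spread coming from sampling noise and the choice of neighborhood size.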

FNN documentation built on May 29, 2017, 9:41 a.m.