kld: Kullback-Leibler Divergence


View source: R/kld.R

Description

Provides an estimate of the Kullback-Leibler divergence between two probability distributions, computed as the difference between their cross-entropy and the entropy of the first distribution.
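
For reference, the quantity being estimated is the standard Kullback-Leibler divergence, which can be written as the cross-entropy of P with Q minus the entropy of P:

D_{KL}(P \| Q) = H(P, Q) - H(P) = \sum_i p_i \log(p_i / q_i)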

Usage

kld(x, y, bins)

Arguments

x, y

numeric or discrete (factor) data vectors

bins

number of bins used to discretize numeric input; the discrete example below omits it
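
For intuition, the sketch below shows one way such an estimate can be computed for numeric input: discretize both vectors over a shared set of bins, convert counts to relative frequencies, and sum p * log(p / q). This is an illustration only, not the actual mlf implementation; the function name kl_sketch and the equal-width binning via cut() are assumptions.

# Illustrative sketch only, not the mlf implementation
kl_sketch <- function(x, y, bins) {
  # Share one set of break points so both vectors fall into the same bins
  breaks <- seq(min(c(x, y)), max(c(x, y)), length.out = bins + 1)
  p <- table(cut(x, breaks, include.lowest = TRUE)) / length(x)
  q <- table(cut(y, breaks, include.lowest = TRUE)) / length(y)
  # Keep only bins observed in both samples; this sidesteps log(0)
  # (a bin with p > 0 but q = 0 would make the divergence infinite)
  keep <- p > 0 & q > 0
  sum(p[keep] * log(p[keep] / q[keep]))
}

# Usage with inputs like those in the Examples section below
a <- rnorm(25, 80, 35)
b <- rnorm(25, 90, 35)
kl_sketch(a, b, bins = 2)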

Examples

# Sample numeric vectors
a <- rnorm(25, 80, 35)
b <- rnorm(25, 90, 35)
mlf::kld(a, b, bins = 2)

# Sample discrete vectors
a <- as.factor(c(1,1,2,2))
b <- as.factor(c(1,1,1,2))
mlf::kld(a, b)

Example output

[1] 0.0746387
[1] 0.2075187

