kldiv: Estimate the Kullback-Leibler divergence between two random variates


View source: R/kldiv.R

Description

Estimate the Kullback-Leibler divergence between two random variates

Usage

kldiv(x1, x2, nbreaks = 100, minx = min(c(x1, x2)),
  maxx = max(c(x1, x2)), small = 0.01)
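For example (a minimal sketch; the normal draws below are purely illustrative stand-ins for MCMC output):

post_draws <- rnorm(100000, mean = 0.5, sd = 0.2)  # stand-in posterior draws
prior_draws <- rnorm(100000, mean = 0, sd = 1)     # stand-in prior draws
kldiv(post_draws, prior_draws, nbreaks = 100)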

Arguments

x1

A numeric random variate of draws from the posterior distribution

x2

A numeric random variate of draws from the prior distribution

nbreaks

A single numeric giving the number of breaks used to bin the variates into discrete distributions

minx

A single numeric giving the lower bound of integration

maxx

A single numeric giving the upper bound of integration

small

A small number added to the histogram counts to prevent division by zero

Details

Kullback-Leibler divergence is approximated by binning the random variates and calculating the KL divergence between the resulting discrete distributions.
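Conceptually, the discrete approximation looks roughly like the sketch below (an illustration of the idea only, not necessarily the exact code in R/kldiv.R; x1, x2, nbreaks, minx, maxx and small are as defined under Arguments):

# Bin both variates on a common grid, add 'small' to avoid zero counts,
# normalise to probabilities, then sum p * log(p / q)
breaks <- seq(minx, maxx, length.out = nbreaks + 1)
p <- hist(x1, breaks = breaks, plot = FALSE)$counts + small
q <- hist(x2, breaks = breaks, plot = FALSE)$counts + small
p <- p / sum(p)
q <- q / sum(q)
sum(p * log(p / q))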

It is recommended to visually check distribution fits, particularly if the number of random variates is small.
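One way to do such a check (an illustrative sketch using base R graphics, not a function provided by the package):

# Overlay density estimates of the posterior (x1) and prior (x2) draws
# to check that each distribution is adequately characterised by the samples
plot(density(x1), main = "Posterior vs prior draws", xlab = "Parameter value")
lines(density(x2), lty = 2)
legend("topright", legend = c("posterior (x1)", "prior (x2)"), lty = c(1, 2))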

In general this method will be inaccurate if the analysis is performed on too few samples (e.g. fewer than 10 000); more than 100 000 samples would be ideal.

Value

A helldist object containing approximate Hellinger distances and fitted density kernels.

hdist_disc

Estimate of Hellinger distance using a discrete approximation of the distributions

hdist_cont

Estimate of Hellinger distance using a continuous approximation of the distributions

Author(s)

Christopher J. Brown christo.j.brown@gmail.com

