KLD    R Documentation

Kullback-Leibler Divergence

Description

Calculates the Kullback-Leibler divergence for certain fitted model objects.
Usage

KLD(object, ...)
KLDvglm(object, ...)
Arguments

object
    Some VGAM object, for example, a fitted vglm model (see the example below).

...
    Other possible arguments fed into KLDvglm in order to compute the KLD.
Details

The Kullback-Leibler divergence (KLD), or relative entropy, is a measure of how one probability distribution differs from a second, reference probability distribution. Currently the VGAM package computes the KLD for GAITD regression models (e.g., see gaitdpoisson and gaitdnbinomial), where the reference distribution is the (unscaled) parent or base distribution. For such models the formula for the KLD simplifies somewhat, hence one can obtain a quantitative measure of the overall effect of altering, inflating, truncating and deflating certain (special) values.
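To make the definition concrete: for discrete distributions the KLD is the sum over the support of p(y) * log(p(y) / q(y)), where p is the fitted pmf and q the reference (parent) pmf. Below is a minimal sketch of this formula in plain R, independent of the VGAM internals; the helper kld_discrete and the truncation of the support at 20 are illustrative assumptions only.

# Discrete Kullback-Leibler divergence, sum_y p(y) * log(p(y) / q(y)),
# between a fitted pmf p and a reference (parent) pmf q.
# 'kld_discrete' is an illustrative helper, not a VGAM function.
kld_discrete <- function(p, q) {
  stopifnot(length(p) == length(q), all(q > 0))
  pos <- p > 0                        # terms with p(y) = 0 contribute 0
  sum(p[pos] * log(p[pos] / q[pos]))
}

# A zero-inflated Poisson pmf versus its (unscaled) Poisson parent:
y        <- 0:20                      # support truncated for illustration
parent   <- dpois(y, lambda = 1)
inflated <- 0.3 * (y == 0) + 0.7 * dpois(y, lambda = 1)
kld_discrete(inflated, parent)        # nonnegative; 0 iff the pmfs agree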
Value

Returns a nonnegative numeric value giving the corresponding KLD. A value of 0 means there is no difference between the fitted distribution and the ordinary parent or base distribution.
Warning

Numerical problems might occur if any of the evaluated probabilities of the unscaled parent distribution are very close to 0.
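A tiny numeric illustration of why this matters (plain R, not VGAM code): when a parent probability q(y) is nearly 0 at a support point where the fitted p(y) is not, the log-ratio term explodes.

p <- c(0.5, 0.5)            # fitted pmf on two support points
q <- c(1 - 1e-300, 1e-300)  # parent pmf with one probability almost 0
sum(p * log(p / q))         # about 345: the log-ratio term has exploded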
Author(s)

T. W. Yee.
References

Kullback, S. and Leibler, R. A. (1951). On information and sufficiency. Annals of Mathematical Statistics, 22, 79–86.

M'Kendrick, A. G. (1925). Applications of mathematics to medical problems. Proceedings of the Edinburgh Mathematical Society, 44, 98–130.
See Also

gaitdpoisson, gaitdnbinomial.
Examples

# McKendrick (1925): data from 223 Indian village households
cholera <- data.frame(ncases = 0:4,                   # number of cholera cases
                      wfreq  = c(168, 32, 16, 6, 1))  # observed frequencies
fit7 <- vglm(ncases ~ 1, gaitdpoisson(i.mlm = 0, ilambda.p = 1),
             weight = wfreq, data = cholera, trace = TRUE)
coef(fit7, matrix = TRUE)  # fitted coefficients on the link scale
KLD(fit7)                  # divergence of the fit from its Poisson parent