kl: Compute the distance between two fitted Bayesian networks


Description

Compute Shannon's entropy of a fitted Bayesian network and the Kullback-Leibler divergence between two fitted Bayesian networks.

Usage

H(P)
KL(P, Q)

Arguments

P, Q

objects of class bn.fit.

Value

H() and KL() return a single numeric value.

Note

Note that in the case of Gaussian and conditional Gaussian networks the divergence can be negative. Regardless of the type of network, if at least one of the two networks is singular the divergence can be infinite.

If any of the parameters of the two networks are NAs, the divergence will also be NA.
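As a plain-R illustration of the quantities involved (a minimal sketch, not the bnlearn implementation, which factorises these quantities over the networks' local distributions), Shannon's entropy and the Kullback-Leibler divergence for two hand-built discrete distributions can be computed directly; natural logarithms are assumed here:

```r
# Shannon's entropy of a discrete distribution p (natural logs).
shannon_entropy <- function(p) -sum(p * log(p))

# Kullback-Leibler divergence from q to p (natural logs); it is
# non-negative for discrete distributions and zero iff p == q.
kl_divergence <- function(p, q) sum(p * log(p / q))

p <- c(0.2, 0.5, 0.3)
q <- c(0.3, 0.4, 0.3)

shannon_entropy(p)   # about 1.0297
kl_divergence(p, q)  # about 0.0305
kl_divergence(p, p)  # exactly 0
```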

Author(s)

Marco Scutari

Examples

## Not run: 
# discrete networks
dag = model2network("[A][C][F][B|A][D|A:C][E|B:F]")
fitted1 = bn.fit(dag, learning.test, method = "mle")
fitted2 = bn.fit(dag, learning.test, method = "bayes", iss = 20)

H(fitted1)
H(fitted2)

KL(fitted1, fitted1)
KL(fitted2, fitted2)
KL(fitted1, fitted2)

## End(Not run)

# continuous, singular networks.
dag = model2network("[A][B][E][G][C|A:B][D|B][F|A:D:E:G]")
singular = fitted1 = bn.fit(dag, gaussian.test)
singular$A = list(coef = coef(fitted1[["A"]]) + runif(1), sd = 0)

H(singular)
H(fitted1)

KL(singular, fitted1)
KL(fitted1, singular)

bnlearn documentation built on Sept. 11, 2024, 8:27 p.m.