kl: Compute the distance between two fitted Bayesian networks

KL {bnlearn}  R Documentation



Description

Compute the Kullback-Leibler divergence between two fitted Bayesian networks.
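For reference, the quantity computed is the standard Kullback-Leibler divergence of Q from P (shown here in its textbook form; the sum runs over the joint state space of the network):

```latex
KL(P \,\|\, Q) = \sum_{x} P(x) \log \frac{P(x)}{Q(x)}
```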


Usage

KL(P, Q)


Arguments

P, Q: two objects of class bn.fit.


Value

KL() returns a numeric value.


Note

KL() only supports discrete (bn.fit.dnet) and Gaussian (bn.fit.gnet) networks. Note that in the case of Gaussian networks the divergence can be negative. Regardless of the type of network, if at least one of the two networks is singular, the divergence can be +Inf.
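For intuition about the singular case, recall the closed-form divergence between two univariate Gaussians (a standard result, not specific to bnlearn's implementation):

```latex
KL\big(\mathcal{N}(\mu_1, \sigma_1^2) \,\|\, \mathcal{N}(\mu_2, \sigma_2^2)\big)
  = \log\frac{\sigma_2}{\sigma_1}
  + \frac{\sigma_1^2 + (\mu_1 - \mu_2)^2}{2\sigma_2^2}
  - \frac{1}{2}
```

As the variance of Q tends to zero, the middle term diverges; this is why a singular network (such as one with a node whose residual standard deviation is zero) can produce a divergence of +Inf.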

If any parameter of either network is NA, the divergence will also be NA.


Author(s)

Marco Scutari


Examples

## Not run: 
library(bnlearn)

# discrete networks
dag = model2network("[A][C][F][B|A][D|A:C][E|B:F]")
fitted1 = bn.fit(dag, learning.test, method = "mle")
fitted2 = bn.fit(dag, learning.test, method = "bayes", iss = 20)

KL(fitted1, fitted1)
KL(fitted2, fitted2)
KL(fitted1, fitted2)

## End(Not run)

# continuous, singular networks.
dag = model2network("[A][B][E][G][C|A:B][D|B][F|A:D:E:G]")
singular = fitted1 = bn.fit(dag, gaussian.test)
singular$A = list(coef = coef(fitted1[["A"]]) + runif(1), sd = 0)

KL(singular, fitted1)
KL(fitted1, singular)

bnlearn documentation built on May 29, 2024, 5:07 a.m.