KL | R Documentation

## Description

Compute the Kullback-Leibler divergence between two fitted Bayesian networks.

## Usage

```
KL(P, Q)
```

## Arguments

`P`, `Q`: two objects of class `bn.fit`.

## Value

`KL()` returns a numeric value.

## Note

`KL()` only supports discrete (`bn.fit.dnet`) and Gaussian (`bn.fit.gnet`) networks. Note that in the case of Gaussian networks the divergence can be negative. Regardless of the type of network, if at least one of the two networks is singular the divergence can be `+Inf`.
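For intuition on why singularity can push the divergence to `+Inf`, consider the closed-form KL divergence between two univariate Gaussians, which is the kind of per-node quantity a Gaussian network aggregates. `kl.gauss()` below is a hypothetical helper written for illustration, not part of bnlearn:

```
# KL(P || Q) for P = N(mu.p, sd.p^2) and Q = N(mu.q, sd.q^2):
#   log(sd.q / sd.p) + (sd.p^2 + (mu.p - mu.q)^2) / (2 * sd.q^2) - 1/2
kl.gauss = function(mu.p, sd.p, mu.q, sd.q)
  log(sd.q / sd.p) + (sd.p^2 + (mu.p - mu.q)^2) / (2 * sd.q^2) - 0.5

kl.gauss(0, 1, 0, 1)  # identical distributions: 0
kl.gauss(1, 0, 0, 1)  # singular P (zero variance): Inf
```

A degenerate (zero-variance) distribution concentrates all its mass on a single point, so any regular distribution diverges from it without bound.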

If any of the parameters of the two networks are `NA`s, the divergence will also be `NA`.
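As a minimal sketch of this rule (assuming bnlearn and its bundled `gaussian.test` data set), overwriting one local parameter with `NA` should propagate to the result. The `$<-` replacement follows the pattern used in the examples below; whether bnlearn's parameter validation accepts an `NA` coefficient set this way is an assumption here:

```
library(bnlearn)

# fit a small Gaussian network, then break one local distribution.
dag = model2network("[A][B][C|A:B]")
fitted = bn.fit(dag, gaussian.test[, c("A", "B", "C")])
broken = fitted
broken$A = list(coef = c("(Intercept)" = NA), sd = 1)
KL(broken, fitted)  # should be NA, per the rule above
```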

## Author(s)

Marco Scutari

## Examples

```
## Not run:
# discrete networks
dag = model2network("[A][C][F][B|A][D|A:C][E|B:F]")
fitted1 = bn.fit(dag, learning.test, method = "mle")
fitted2 = bn.fit(dag, learning.test, method = "bayes", iss = 20)
KL(fitted1, fitted1)
KL(fitted2, fitted2)
KL(fitted1, fitted2)

# continuous, singular networks.
dag = model2network("[A][B][E][G][C|A:B][D|B][F|A:D:E:G]")
singular = fitted1 = bn.fit(dag, gaussian.test)
singular$A = list(coef = coef(fitted1[["A"]]) + runif(1), sd = 0)
KL(singular, fitted1)
KL(fitted1, singular)
## End(Not run)
```
