View source: R/F1_ComputeKLDs.R
ComputeKLDs: R Documentation
Compute signed and symmetric Kullback-Leibler divergence of variables over a spectrum of evidence
Usage

ComputeKLDs(tree, var0, vars, seq, pbar = TRUE, method = "gaussian",
            epsilon = 10^-6)
Arguments

tree      a ClusterTree object (e.g., as returned by Initializer)
var0      the variable to have evidence absorbed
vars      the variables to have divergence computed
seq       a vector of numeric values used as the evidence absorbed into var0
pbar      logical; whether to display a progress bar
method    method for divergence computation; the default "gaussian" uses a
          Gaussian approximation
epsilon   a small positive constant (default 10^-6) used to guard against
          numerical problems
Details

Compute signed and symmetric Kullback-Leibler divergence of variables over a spectrum of evidence. For continuous variables, the signed and symmetric Kullback-Leibler divergence is also known as Jeffrey's signed information (JSI).
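To illustrate the quantity being computed (this is a minimal, self-contained sketch, not the package's internal code), the symmetric Kullback-Leibler divergence between two univariate Gaussians has a closed form, and a sign can be attached according to the direction of the mean shift; the function name and the sign convention below are illustrative assumptions:

```r
# Illustrative only: signed symmetric KL divergence between N(m1, s1^2)
# (e.g., a node's marginal before evidence) and N(m2, s2^2) (after evidence).
# The sign convention (sign of the mean shift m2 - m1) is an assumption
# made for this sketch.
signed_symmetric_kld <- function(m1, s1, m2, s2) {
  d2 <- (m1 - m2)^2
  # KL(p || q) + KL(q || p); the log-variance terms cancel in the sum
  sym <- (s1^2 + d2) / (2 * s2^2) + (s2^2 + d2) / (2 * s1^2) - 1
  sign(m2 - m1) * sym
}

signed_symmetric_kld(0, 1, 0, 1)   # identical distributions: divergence 0
signed_symmetric_kld(0, 1, 2, 1)   # positive mean shift: positive value
```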
Value

a data.frame of the computed divergences
Author(s)

Han Yu
References

Cowell, R. G. (2005). Local propagation in conditional Gaussian Bayesian networks. Journal of Machine Learning Research, 6(Sep), 1517-1550.

Yu H, Moharil J, Blair RH (2020). BayesNetBP: An R Package for Probabilistic Reasoning in Bayesian Networks. Journal of Statistical Software, 94(3), 1-31. doi:10.18637/jss.v094.i03.
Examples

## Not run: 
data(liver)
tree.init.p <- Initializer(dag = liver$dag, data = liver$data,
                           node.class = liver$node.class, propagate = TRUE)
klds <- ComputeKLDs(tree = tree.init.p, var0 = "Nr1i3",
                    vars = setdiff(tree.init.p@node, "Nr1i3"),
                    seq = seq(-3, 3, 0.5))
head(klds)
## End(Not run)