ComputeKLDs: Compute signed and symmetric Kullback-Leibler divergence

Description Usage Arguments Details Value Author(s) Examples

View source: R/F1_ComputeKLDs.R

Description

Compute signed and symmetric Kullback-Leibler divergence of variables over a spectrum of evidence

Usage

ComputeKLDs(tree, var0, vars, seq, pbar = TRUE, method = "gaussian")

Arguments

tree

a ClusterTree object

var0

the variable to have evidence absorbed

vars

the variables to have divergence computed

seq

a numeric vector of evidence values

pbar

logical(1), whether to show a progress bar

method

method for divergence computation: "gaussian" for Gaussian approximation, "mc" for Monte Carlo integration

Details

Compute the signed and symmetric Kullback-Leibler divergence of variables over a spectrum of evidence. For continuous variables, the signed and symmetric Kullback-Leibler divergence is also known as Jeffrey's signed information (JSI).
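As a minimal illustration of the quantity involved (not the package's internal implementation), the sketch below computes a signed, symmetric KL divergence between two univariate Gaussians using the standard closed-form KL formula. The sign convention used here (the sign of the shift in mean) is an assumption for illustration.

```r
## Closed-form KL divergence between two univariate Gaussians:
## KL( N(m1, s1^2) || N(m2, s2^2) )
kl_gauss <- function(m1, s1, m2, s2) {
  log(s2 / s1) + (s1^2 + (m1 - m2)^2) / (2 * s2^2) - 0.5
}

## Symmetric KL divergence: KL(p||q) + KL(q||p)
sym_kl <- function(m1, s1, m2, s2) {
  kl_gauss(m1, s1, m2, s2) + kl_gauss(m2, s2, m1, s1)
}

## Signed symmetric KL: sign taken from the shift in mean
## (an illustrative convention, not necessarily the package's definition)
signed_sym_kl <- function(m1, s1, m2, s2) {
  sign(m2 - m1) * sym_kl(m1, s1, m2, s2)
}

signed_sym_kl(0, 1, 1, 1)  # 1 for a unit mean shift at unit variance
```

A positive value indicates the evidence shifted the distribution upward, a negative value downward, while the magnitude measures how strongly the distribution changed.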

Value

a data.frame containing the computed divergences

Author(s)

Han Yu

Examples

## Not run: 
data(liver)
tree.init.p <- Initializer(dag=liver$dag, data=liver$data, 
                           node.class=liver$node.class, 
                           propagate = TRUE)
klds <- ComputeKLDs(tree=tree.init.p, var0="Nr1i3", 
                    vars=setdiff(tree.init.p@node, "Nr1i3"),
                    seq=seq(-3,3,0.5))
head(klds)

## End(Not run)

Yam76/BayesNetBP documentation built on Aug. 23, 2019, 1:23 a.m.