KL.GBN: KL Divergence for 'GBN'

View source: R/KL.R

KL.GBN {bnmonitor}  R Documentation

KL Divergence for GBN

Description

KL.GBN returns the Kullback-Leibler (KL) divergence between an object of class GBN and its update after a standard parameter variation.

Usage

## S3 method for class 'GBN'
KL(x, where, entry, delta, ...)

Arguments

x

object of class GBN.

where

character string: either "mean" or "covariance", for variations of the mean vector or the covariance matrix, respectively.

entry

if where == "mean", entry is the index of the entry of the mean vector to vary. If where == "covariance", entry is a vector of length 2 giving the row and column indices of the entry of the covariance matrix to vary.

delta

numeric vector of variation parameters; each acts additively on the chosen entry.

...

additional arguments for compatibility.

Details

Computes the KL divergence between a Gaussian Bayesian network and the network obtained after an additive perturbation of either an entry of the mean vector or an entry of the covariance matrix.
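For reference, the computation rests on the standard closed form for the KL divergence between two d-dimensional Gaussians (this formula is not stated in the help page itself; the direction of the divergence used by the package should be checked against the cited references):

KL(N_0 || N_1) = (1/2) [ tr(Sigma_1^{-1} Sigma_0) + (mu_1 - mu_0)^T Sigma_1^{-1} (mu_1 - mu_0) - d + ln( det Sigma_1 / det Sigma_0 ) ]

In particular, if only entry i of the mean vector is shifted by delta (so mu_1 - mu_0 = delta * e_i) and the covariance matrix is unchanged, this reduces to

KL = (delta^2 / 2) * (Sigma^{-1})_{ii},

which is quadratic in delta, consistent with the symmetric profiles produced by varying delta over seq(-1, 1, 0.1) in the Examples below.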

Value

A data frame whose first column contains the variations performed and whose second column contains the corresponding KL divergences.

References

Gómez-Villegas, M. A., Maín, P., & Susi, R. (2007). Sensitivity analysis in Gaussian Bayesian networks using a divergence measure. Communications in Statistics—Theory and Methods, 36(3), 523-539.

Gómez-Villegas, M. A., Main, P., & Susi, R. (2013). The effect of block parameter perturbations in Gaussian Bayesian networks: Sensitivity and robustness. Information Sciences, 222, 439-458.

See Also

KL.CI, Fro.CI, Fro.GBN, Jeffreys.GBN, Jeffreys.CI

Examples

library(bnmonitor)

# Vary the second entry of the mean vector additively over [-1, 1]
KL(synthetic_gbn, "mean", 2, seq(-1, 1, 0.1))

# Vary the (3, 3) entry of the covariance matrix additively over [-1, 1]
KL(synthetic_gbn, "covariance", c(3, 3), seq(-1, 1, 0.1))


bnmonitor documentation built on June 7, 2023, 5:19 p.m.