KL: Kullback-Leibler divergence for APFA models

View source: R/APFAfunctions.R

Description

The Kullback-Leibler divergence measures the discrepancy between two APFA models. It is non-negative, and it is zero if the two models are identical.

Usage

KL(A,B)

Arguments

A

An APFA model, represented as an igraph object.

B

An APFA model, represented as an igraph object.

Details

A and B must be commensurate, i.e., defined on the same variable set. Note that the KL-divergence is not a true distance measure, since it is not symmetric in A and B. For large APFAs, computing the KL-divergence may be prohibitive in both time and memory.
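
For orientation, the value computed is the standard Kullback-Leibler divergence between the distributions over sequences defined by the two models. The notation below is a sketch rather than part of the package documentation: P_A and P_B denote the sequence distributions of A and B, and the argument order is assumed to follow the usual convention (and Thollard et al., 2000):

KL(A, B) = \sum_{s} P_A(s) \log \frac{P_A(s)}{P_B(s)}

where the sum runs over all sequences s with P_A(s) > 0. The asymmetry noted above follows directly, since P_A and P_B play different roles in this expression.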

Value

Returns the KL-divergence, a non-negative numeric value.

Author(s)

Smitha Ankinakatte and David Edwards

References

Thollard, F., Dupont, P. and de la Higuera, C. (2000). Probabilistic DFA inference using Kullback-Leibler divergence and minimality. In Proceedings of the 17th International Conference on Machine Learning, 975-982.

Examples

library(gRapfa)
data(Wheeze)                           # example data set with 537 observations
samp <- sample(1:537, 250)             # split the rows into two subsets
G <- select.IC(Wheeze[samp, ])         # select an APFA from the first subset by an information criterion
G.fit <- fit.APFA(G, Wheeze[-samp, ])  # refit the selected APFA to the remaining rows
kl <- KL(G, G.fit)                     # KL-divergence between the two models
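
As a further check (a sketch only, reusing the objects created above and assuming both carry fitted probabilities), the properties noted under Details can be illustrated directly:

KL(G, G)       # a model compared with itself: expected to be zero
KL(G.fit, G)   # arguments reversed: in general differs from kl above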
