View source: R/APFAfunctions.R
Description

The Kullback-Leibler divergence measures the dissimilarity between two APFA (acyclic probabilistic finite automaton) models: if the two models are identical, the divergence is zero.
Usage

KL(A, B)
Arguments

A: an APFA, represented as an igraph object
B: an APFA, represented as an igraph object
Details

A and B must be commensurate, i.e., defined on the same variable set. Note that the KL-divergence is not a true distance measure, since it is not symmetric in A and B. For large APFAs, computing the KL-divergence may be prohibitive in both time and memory.
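The asymmetry noted above can be illustrated without the package itself. The sketch below (plain base R, not code from this package) computes the KL-divergence between two small discrete distributions; `kl_div` is a hypothetical helper, not a function of this package.

```r
## KL-divergence between two discrete distributions p and q.
## Illustrates that KL(p, q) is zero when p == q but is not symmetric.
kl_div <- function(p, q) sum(p * log(p / q))

p <- c(0.5, 0.3, 0.2)
q <- c(0.4, 0.4, 0.2)

kl_div(p, p)  # identical distributions: divergence is 0
kl_div(p, q)  # KL(p || q)
kl_div(q, p)  # KL(q || p): a different value in general
```

The same asymmetry holds for the distributions over strings defined by two APFA models, which is why KL(A, B) and KL(B, A) generally differ.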
Value

Returns the KL-divergence as a single numeric value.
Author(s)

Smitha Ankinakatte and David Edwards
References

Thollard, F., Dupont, P. and de la Higuera, C. (2000). Probabilistic DFA inference using Kullback-Leibler divergence and minimality. Proceedings of the 17th International Conference on Machine Learning, 975-982.
Examples
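The example code on this page did not survive extraction. As a stand-in, here is a hedged base-R sketch of the quantity KL() computes, using two tiny one-state probabilistic automata over the alphabet {a, b} that emit strings of length 3 (this uses no functions from the package; `emit_probs` is hypothetical):

```r
## Two one-state probabilistic automata, each emitting 'a' with a fixed
## probability at every step.  emit_probs() returns the probability of
## each of the 2^n length-n strings under independent emissions.
emit_probs <- function(p_a, n = 3) {
  strings <- expand.grid(rep(list(c("a", "b")), n))
  apply(strings, 1, function(s) prod(ifelse(s == "a", p_a, 1 - p_a)))
}

pA <- emit_probs(0.7)   # model A emits 'a' with probability 0.7
pB <- emit_probs(0.5)   # model B emits 'a' with probability 0.5

sum(pA * log(pA / pB))  # KL-divergence of the string distributions
```

KL() plays the analogous role for the (generally much larger) string distributions defined by two commensurate APFA models.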