SplitEntropy {TreeDist} | R Documentation
Calculate the entropy, joint entropy, entropy distance and information content of two splits, treating each split as a division of n leaves into two groups. Further details are available in a vignette; see Mackay (2003) and Meila (2007).
SplitEntropy(split1, split2 = split1)
split1, split2: Logical vectors listing leaves in a consistent order, identifying each leaf as a member of the ingroup (TRUE) or outgroup (FALSE) of the split in question.
A numeric vector listing, in bits:

H1: the entropy of split 1;
H2: the entropy of split 2;
H12: the joint entropy of both splits;
I: the mutual information of the splits;
Hd: the entropy distance (variation of information) of the splits.
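These quantities can be reproduced from first principles. Below is a minimal sketch in base R that treats each split as a partition of the leaves and computes the five values returned above; the `Entropy` helper is illustrative and not part of TreeDist. The splits are the six-leaf pair used in the example at the end of this page.

```r
# Shannon entropy, in bits, of a vector of probabilities (zero cells dropped)
Entropy <- function(p) -sum(p[p > 0] * log2(p[p > 0]))

split1 <- c(TRUE, TRUE, TRUE, FALSE, FALSE, FALSE)  # 3 + 3 leaves
split2 <- c(TRUE, TRUE, FALSE, FALSE, FALSE, FALSE) # 2 + 4 leaves
n <- length(split1)

h1 <- Entropy(table(split1) / n)           # H1  = 1 bit
h2 <- Entropy(table(split2) / n)           # H2  = log2(3) - 2/3, ~0.918 bits
h12 <- Entropy(table(split1, split2) / n)  # H12 from the 2x2 joint table
i <- h1 + h2 - h12                         # mutual information, ~0.459 bits
hd <- h12 - i                              # entropy distance, 1 bit
```

Note that Hd = H12 - I = 2 H12 - H1 - H2, so for this pair the entropy distance works out to exactly 1 bit.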
Martin R. Smith (martin.smith@durham.ac.uk)
Other information functions: SplitSharedInformation(), TreeInfo
# TRUE marks a leaf in the ingroup of a split; FALSE marks the outgroup
A <- TRUE
B <- FALSE
SplitEntropy(c(A, A, A, B, B, B), c(A, A, B, B, B, B))