Mutual information (MI) is a Kullback-Leibler type of measure for directed divergence. It is a positive number ranging from 0 to log2(C), where C is the number of clusters. Higher values indicate less divergence, and hence better clustering results. Note that MI is not suitable for validating k-means clustering on highly imbalanced datasets, because it cannot capture the imbalance. MI is equivalent to the entropy measure.
ev.mi(x, y)
x: A vector with cluster assignments.
y: A vector with cluster assignments.
A positive number.
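The quantity ev.mi computes can be sketched with the standard definition of mutual information between two cluster assignments, MI = sum over pairs (a, b) of p(a,b) * log2(p(a,b) / (p(a) * p(b))). The helper name mutual_information and the sample vectors below are illustrative, not part of the package:

```python
from collections import Counter
from math import log2

def mutual_information(x, y):
    """Mutual information (in bits) between two cluster assignments."""
    n = len(x)
    px = Counter(x)              # marginal counts of clusters in x
    py = Counter(y)              # marginal counts of clusters in y
    pxy = Counter(zip(x, y))     # joint counts of cluster pairs
    mi = 0.0
    for (a, b), c in pxy.items():
        p_ab = c / n
        # p_ab / (p_a * p_b) rewritten with counts: c*n / (px[a]*py[b])
        mi += p_ab * log2(p_ab * n * n / (px[a] * py[b]))
    return mi

# Identical partitions with C = 2 equally sized clusters reach the
# upper bound log2(C) = 1 bit; independent partitions give 0.
print(mutual_information([0, 0, 1, 1], [0, 0, 1, 1]))  # 1.0
print(mutual_information([0, 0, 1, 1], [0, 1, 0, 1]))  # 0.0
```

The identical-partition case also illustrates the equivalence with entropy noted above: when x equals y, MI reduces to the entropy of the single partition.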