NMI: Normalized Mutual Information
Description

Estimates the normalized mutual information (NMI) between two images.
Usage

NMI(imA, imB, binsA = 10, binsB = 10)
Arguments

imA, imB        cimg objects (the two images to compare)

binsA, binsB    numerical values indicating the number of discrete bins
                used to build the intensity histogram of each image
Details

In probability theory and information theory, the mutual information (MI) of two random variables is a measure of the mutual dependence between them. More specifically, it quantifies the "amount of information" (in units such as bits) obtained about one random variable by observing the other. The concept of mutual information is closely linked to that of the entropy of a random variable, a fundamental notion in information theory that quantifies the "amount of information" held in a random variable. Normalized mutual information rescales MI by the entropies of the two variables, so that values are comparable across image pairs.
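The help text does not state which normalization this package uses; a common choice is NMI(A, B) = (H(A) + H(B)) / H(A, B), which equals 1 for independent variables and 2 for identical ones. A minimal sketch under that assumption, with `nmi_sketch` as a hypothetical helper (not the package's implementation):

```r
# Sketch of NMI from binned intensities, assuming the normalization
# NMI = (H(A) + H(B)) / H(A, B). The package's definition may differ.
nmi_sketch <- function(a, b, binsA = 10, binsB = 10) {
  cutA <- cut(as.vector(a), breaks = binsA, labels = FALSE)  # discretize A
  cutB <- cut(as.vector(b), breaks = binsB, labels = FALSE)  # discretize B
  pAB  <- table(cutA, cutB) / length(cutA)  # joint distribution
  pA   <- rowSums(pAB)                      # marginal of A
  pB   <- colSums(pAB)                      # marginal of B
  H <- function(p) { p <- p[p > 0]; -sum(p * log(p)) }  # Shannon entropy
  (H(pA) + H(pB)) / H(pAB)
}
```

For identical inputs the joint histogram is diagonal, so H(A, B) = H(A) = H(B) and the sketch returns 2; for unrelated inputs it approaches 1.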
Value

The NMI value for imA and imB.
Examples
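The original example code was lost in extraction. A hypothetical usage sketch, assuming the imager package (which provides the cimg class this function expects) and its built-in sample image `boats`:

```r
# Hypothetical example, not the package's original code.
library(imager)
im1 <- grayscale(boats)   # sample image shipped with imager
im2 <- isoblur(im1, 3)    # blurred copy of the same image
NMI(im1, im1)                          # identical images: maximal NMI
NMI(im1, im2, binsA = 10, binsB = 10)  # similar images: intermediate NMI
```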