NMI: Normalized Mutual Information estimator


Description

Estimates the Normalized Mutual Information (NMI) between two images.

Usage

NMI(imA, imB, binsA = 10, binsB = 10)

Arguments

imA, imB

cimg objects

binsA, binsB

numerical values indicating the number of discrete bins used for each image

Details

In probability theory and information theory, the mutual information (MI) of two random variables measures their mutual dependence. More specifically, it quantifies the "amount of information" (in units such as bits) obtained about one random variable by observing the other. Mutual information is closely linked to the entropy of a random variable, a fundamental notion in information theory that quantifies the "amount of information" held in a random variable. The normalized variant rescales the MI by the entropies of the two variables so that values are comparable across image pairs.
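As a point of reference, a histogram-based NMI estimate can be reproduced from first principles by binning the two images, forming their joint probability table, and combining the resulting entropies. The sketch below is a minimal illustration only: nmi_sketch is a hypothetical helper, and it assumes the common normalisation 2 I(A;B) / (H(A) + H(B)); the package itself may use a different normalisation or binning scheme.

# Minimal sketch of a histogram-based NMI estimator (hypothetical helper;
# assumes NMI = 2*I(A;B) / (H(A) + H(B)), which may differ from the package)
nmi_sketch <- function(a, b, binsA = 10, binsB = 10) {
  fa  <- cut(as.numeric(a), breaks = binsA)  # discretise image A into binsA levels
  fb  <- cut(as.numeric(b), breaks = binsB)  # discretise image B into binsB levels
  pab <- table(fa, fb) / length(fa)          # joint probability table
  pa  <- rowSums(pab)                        # marginal distribution of A
  pb  <- colSums(pab)                        # marginal distribution of B
  H   <- function(p) { p <- p[p > 0]; -sum(p * log(p)) }  # Shannon entropy
  mi  <- H(pa) + H(pb) - H(pab)              # I(A;B) = H(A) + H(B) - H(A,B)
  2 * mi / (H(pa) + H(pb))                   # assumed normalisation
}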

Value

the NMI value for imA and imB

Examples

imA <- ground[[1]]
imB <- ground[[2]]
imC <- ground[[20]]
NMI(imA, imB)
NMI(imA, imC)
# NMI of every image in ground against the first image
nmi <- sapply(ground, NMI, imB = ground[[1]])
plot(nmi)
lines(nmi)
