MolecularMI: Molecular Mutual Information


Description

Mutual information (MI) measures the interdependence of two discrete random variables: it quantifies the reduction in uncertainty about one variable given knowledge of the other. Placing the per-site entropy values on the diagonal of the MI matrix yields a structure comparable to a covariance matrix, suitable for variability decomposition. MI identifies pairs of statistically dependent (coupled) sites, with a normalized MI of 1 indicating complete coupling.
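
As a minimal base-R illustration of the identity MI(X, Y) = H(X) + H(Y) - H(X, Y) used throughout this page (independent of HDMD; the helper names below are hypothetical and use natural-log entropy, which need not match the package's scaling):

    ## Shannon entropy (natural log) of a discrete vector -- illustrative helper
    shannonH <- function(v) {
      p <- table(v) / length(v)
      -sum(p * log(p))
    }

    ## Mutual information via MI = H(X) + H(Y) - H(X, Y)
    pairMI <- function(x, y) {
      shannonH(x) + shannonH(y) - shannonH(paste(x, y))
    }

    x <- c("A", "A", "G", "G", "C", "C")
    y <- c("T", "T", "C", "C", "A", "A")   # perfectly coupled to x
    z <- c("A", "G", "A", "G", "A", "G")   # independent of x

    pairMI(x, y)   # equals shannonH(x): complete coupling
    pairMI(x, x)   # MI of a site with itself is its entropy (the diagonal)
    pairMI(x, z)   # 0 here: x and z are independent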

Usage

MolecularMI(x, type, normalized)

Arguments

x

A matrix, vector, or list of aligned DNA or amino acid sequences. If a matrix, rows must be sequences and columns the individual characters (sites) of the alignment; vector and list inputs are coerced into this format.

type

"DNA", "AA", or "GroupAA" method for calculating and normalizing the entropy value for each column (site)

normalized

Method of normalization. If NULL or not provided, MI[i,j] = H(x[i]) + H(x[j]) - H(x[i], x[j]) for i, j = 1..n, where n is the number of sites. Otherwise, MI is normalized by a leveling constant; see NMI. A sketch of one possible normalization follows below.
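
The leveling constants actually applied are documented under NMI. As one hedged illustration only (an assumption for exposition, not necessarily what NMI does), a common choice divides each MI value by the joint entropy of the pair, which bounds the result in [0, 1]:

    ## Illustrative normalization of an MI matrix by the joint entropy H(i, j);
    ## normalizeMI is a hypothetical helper, not part of HDMD.
    normalizeMI <- function(MI) {
      H  <- diag(MI)                     # per-site entropies sit on the diagonal
      Hj <- outer(H, H, "+") - MI        # joint entropy: H(i) + H(j) - MI(i, j)
      N  <- MI / Hj                      # normalized MI in [0, 1]
      N[!is.finite(N)] <- 0              # invariant site pairs (0/0) carry no signal
      N
    }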

Value

An n x n matrix of mutual information values (for type "DNA", "AA", or "GroupAA"), where n is the number of sites in the alignment. The diagonal contains the entropy value of each site.

Author(s)

Lisa McFerrin

See Also

MolecularEntropy, NMI

Examples

library(HDMD)

## bHLH example alignment shipped with HDMD; column 2 holds the sequences
data(bHLH288)
bHLH_Seq <- bHLH288[, 2]

## Mutual information using the amino acid and functional-group alphabets
bHLH.MIAA <- MolecularMI(bHLH_Seq, "AA")
bHLH.MIFG <- MolecularMI(bHLH_Seq, "GroupAA")

## Compare entropy values: the MI diagonal holds the per-site entropies
MolecularEntropy(bHLH_Seq, "AA")$H
diag(bHLH.MIAA)
diag(bHLH.MIFG)

plot(diag(bHLH.MIFG), type = "h", ylab = "Functional Entropy", xlab = "site")
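
As a follow-up to the example above (not part of the shipped example), one way to pull out the most strongly coupled site pairs from the MI matrix, assuming the objects created above, is:

    ## Rank site pairs by mutual information, ignoring the entropy diagonal
    MI <- bHLH.MIAA
    MI[lower.tri(MI, diag = TRUE)] <- NA                   # keep each pair once
    top <- order(MI, decreasing = TRUE, na.last = NA)[1:5] # five largest MI values
    cbind(site_i = row(MI)[top], site_j = col(MI)[top], MI = MI[top])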
