entropyML: Maximum Likelihood Entropy Estimate


Description

Computes the Maximum Likelihood (plug-in) entropy estimate from cellCounts.
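
The ML (plug-in) estimator replaces the unknown cell probabilities with the observed relative frequencies p_i = n_i / N and computes H = -sum_i p_i log(p_i), with the logarithm base determined by unit ("bit": base 2, "ban": base 10, "nat": base e). A minimal sketch of this computation is shown below; the helper name entropyMLSketch and its body are illustrative only, not the package's internal code.

# Illustrative plug-in (maximum likelihood) entropy estimate from a count table;
# a sketch only, not the internal implementation of entropyML().
entropyMLSketch <- function(cellCounts, unit = c("bit", "ban", "nat")) {
  unit <- match.arg(unit)
  base <- switch(unit, bit = 2, ban = 10, nat = exp(1))
  p <- cellCounts / sum(cellCounts)     # observed relative frequencies p_i = n_i / N
  p <- p[p > 0]                         # 0 * log(0) is taken to be 0
  -sum(p * log(p, base = base))         # H = -sum_i p_i log(p_i)
}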

Usage

entropyML(cellCounts, unit = unit)

Arguments

cellCounts

an integer vector (or matrix) representing the number of times each particular count is obtained.

unit

the unit in which entropy is measured. One of "bit" (log2, default), "ban" (log10) or "nat" (natural units).

Value

The entropyML function returns the maximum likelihood estimate of the entropy H(X) of the gene described by cellCounts (or, when cellCounts is a two-way count table, the joint entropy H(X,Y) of a pair of genes).
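
Because cellCounts may be a matrix, the joint entropy of two genes can be estimated by tabulating their counts together. The following fragment is illustrative, assuming the simData object created in the Examples section below:

jointCounts <- table(simData$counts[1, ], simData$counts[2, ])  # two-way count table
hXY <- entropyML(jointCounts, unit = "nat")                     # estimate of H(X,Y)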

Author(s)

Luciano Garofano lucianogarofano88@gmail.com, Stefano Maria Pagnotta, Michele Ceccarelli

References

Paninski L. (2003). Estimation of Entropy and Mutual Information. Neural Computation, vol. 15 no. 6 pp. 1191-1253.

Meyer P.E., Laffitte F., Bontempi G. (2008). minet: A R/Bioconductor Package for Inferring Large Transcriptional Networks Using Mutual Information. BMC Bioinformatics 9:461.

Antos A., Kontoyiannis I. (2001). Convergence properties of functional estimates for discrete distributions. Random Structures and Algorithms, vol. 19 pp. 163-193.

Strong S., Koberle R., de Ruyter van Steveninck R.R., Bialek W. (1998). Entropy and Information in Neural Spike Trains. Physical Review Letters, vol. 80 pp. 197-202.

See Also

entropyMM, entropyBayes, entropyCS, entropyShrink

Examples

library(synRNASeqNet)

simData <- simulatedData(p = 50, n = 100, mu = 100, sigma = 0.25,
                         ppower = 0.73, noise = FALSE)
cellCounts <- table(simData$counts[1, ])    # count table for the first gene
eML <- entropyML(cellCounts, unit = "nat")  # ML entropy estimate in nats
