View source: R/parEntropyEstimate.R
Description:

A function that computes the entropy between all pairs of rows (or a specified subset of them) of the matrix counts, using the indirect estimation methods.
Usage:

parEntropyEstimate(idx, method = method, unit = unit,
  priorHyperParam = priorHyperParam,
  shrinkageTarget = shrinkageTarget, boot = boot)
Arguments:

idx: the index of the cell corresponding to the interaction to be estimated.

method: a character string indicating which estimate is to be computed. One of ...

unit: the unit in which mutual information is measured. One of ...

priorHyperParam: the prior distribution type for the Bayes estimation. One of ...

shrinkageTarget: the shrinkage target frequencies. If not specified (the default), they are estimated in a James-Stein-type fashion (uniform distribution).

boot: logical (...)
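The James-Stein-type shrinkage behind the default shrinkageTarget can be sketched as follows. This is an illustrative reimplementation, not the package's internal code: the function name jsShrinkFreqs and the closed-form shrinkage intensity (in the style of Hausser-Strimmer shrinkage estimators) are assumptions for the example.

```r
# Illustrative sketch (NOT the package's internal code): James-Stein-type
# shrinkage of observed cell frequencies toward a uniform target, the
# behaviour described for an unspecified shrinkageTarget.
jsShrinkFreqs <- function(counts) {
  n    <- sum(counts)
  p_ml <- counts / n                                # maximum-likelihood frequencies
  t_k  <- rep(1 / length(counts), length(counts))   # uniform shrinkage target
  # estimated shrinkage intensity, clipped to [0, 1]
  lambda <- (1 - sum(p_ml^2)) / ((n - 1) * sum((t_k - p_ml)^2))
  lambda <- max(0, min(1, lambda))
  lambda * t_k + (1 - lambda) * p_ml                # shrunken frequencies
}
```

Shrinking the empirical frequencies toward the uniform distribution regularizes cells with few counts, which stabilizes the downstream entropy estimate for small samples.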
Details:

Internal function of parMIEstimate.
Value:

The parEntropyEstimate function returns the value of the entropy H(X, Y) for the given pair of genes.
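To make the returned quantity concrete, the sketch below computes the basic plug-in (maximum-likelihood) joint entropy H(X, Y) of two discretized expression profiles; the indirect estimators listed in the references refine this baseline. The function name jointEntropyML and the unit choices "nat"/"bit" are assumptions for illustration, not the package's API.

```r
# Illustrative sketch (not the package's code): plug-in joint entropy of a
# pair of discretized gene profiles.
jointEntropyML <- function(x, y, unit = c("nat", "bit")) {
  unit <- match.arg(unit)
  tab  <- table(x, y)        # joint contingency table of the pair
  p    <- tab / sum(tab)     # empirical joint frequencies
  p    <- p[p > 0]           # drop empty cells: 0 * log(0) is taken as 0
  H    <- -sum(p * log(p))   # entropy in nats
  if (unit == "bit") H <- H / log(2)
  H
}
```

For two binary profiles whose four joint states are equally frequent, this returns log(4) nats, i.e. 2 bits.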
Author(s):

Luciano Garofano <lucianogarofano88@gmail.com>, Stefano Maria Pagnotta, Michele Ceccarelli
References:

Paninski L. (2003). Estimation of Entropy and Mutual Information. Neural Computation, vol. 15 no. 6 pp. 1191-1253.
Meyer P.E., Laffitte F., Bontempi G. (2008). minet: A R/Bioconductor Package for Inferring Large Transcriptional Networks Using Mutual Information. BMC Bioinformatics 9:461.
Antos A., Kontoyiannis I. (2001). Convergence properties of functional estimates for discrete distributions. Random Structures and Algorithms, vol. 19 pp. 163-193.
Strong S., Koberle R., de Ruyter van Steveninck R.R., Bialek W. (1998). Entropy and Information in Neural Spike Trains. Physical Review Letters, vol. 80 pp. 197-202.
Miller G.A. (1955). Note on the bias of information estimates. Information Theory in Psychology, II-B pp. 95-100.
Jeffreys H. (1946). An invariant form for the prior probability in estimation problems. Proceedings of the Royal Society of London, vol. 186 no. 1007 pp. 453-461.
Krichevsky R.E., Trofimov V.K. (1981). The performance of universal encoding. IEEE Transactions on Information Theory, vol. 27 pp. 199-207.
Holste D., Herzel H. (1998). Bayes' estimators of generalized entropies. Journal of Physics A, vol. 31 pp. 2551-2566.
Perks W. (1947). Some observations on inverse probability including a new indifference rule. Journal of the Institute of Actuaries, vol. 73 pp. 285-334.
Schürmann T., Grassberger P. (1996). Entropy estimation of symbol sequences. Chaos, vol. 6 pp. 414-427.
Trybula S. (1958). Some problems of simultaneous minimax estimation. The Annals of Mathematical Statistics, vol. 29 pp. 245-253.
Chao A., Shen T.J. (2003). Nonparametric estimation of Shannon's index diversity when there are unseen species. Environmental and Ecological Statistics, vol. 10 pp. 429-443.
James W., Stein C. (1961). Estimation with Quadratic Loss. Proceedings of the Fourth Berkeley Symposium on Mathematical Statistics and Probability, vol. 1 pp. 361-379.