divergence.Shannon: Mutual Information

View source: R/divergence.Shannon.R

Description

This function calculates the mutual information (Rényi order equal to 1) by means of the Kullback-Leibler divergence.
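For reference, the standard identities involved (textbook definitions, not MEET-specific notation) are

I(X;Y) = D_{KL}\big(p(x,y) \,\|\, p(x)\,p(y)\big) = \sum_{x,y} p(x,y) \log \frac{p(x,y)}{p(x)\,p(y)} = H(X) + H(Y) - H(X,Y)

where the H and HXY arguments supply the entropy and joint-entropy terms.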

Usage

divergence.Shannon(training.set, H, HXY, correction)

Arguments

training.set

A set of aligned nucleotide sequences.

H

Entropy.

HXY

Joint entropy.

correction

Correction for the finite sample size effect (see the note after this list).
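Note: a common finite-sample bias correction in the literature (given here for background only; not necessarily what MEET's correction.entropy implements) is the Miller-Madow term, which is added to the plug-in entropy estimate:

\hat{H}_{MM} = \hat{H}_{plugin} + \frac{K - 1}{2N}

where K is the number of symbols with nonzero observed frequency and N is the number of observations.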

Details

The Rényi order has to be equal to 1; for other Rényi orders, see divergence.Renyi.
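For context, the Rényi entropy of order q and its limit as q approaches 1 (standard definitions) are

H_q(X) = \frac{1}{1-q} \log \sum_i p_i^q, \qquad \lim_{q \to 1} H_q(X) = -\sum_i p_i \log p_i

so order q = 1 is precisely the Shannon case handled by this function.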

Author(s)

Joan Maynou <joan.maynou@upc.edu>

References

J. Maynou, M. Vallverdu, F. Claria, J. J. Gallardo-Chacon, P. Caminal and A. Perera, "Transcription Factor Binding Site Detection through Position Cross-Mutual Information Variability Analysis," 31st Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC), 2009.

See Also

divergence.Renyi, PredictDivergence, kfold.divergence

Examples

data(BackgroundOrganism)                    # provides Prob, the background model used below
data(iicc)                                  # provides the precomputed inputs for this example
HXmax <- iicc$HXmax                         # maximum entropy (not passed to divergence.Shannon)
H <- iicc$Entropy[[1]]                      # entropy term H
q <- 1                                      # Renyi order; must equal 1 for Shannon
training.set <- iicc$Transcriptionfactor    # aligned nucleotide sequences
correction <- correction.entropy(q, p = nrow(training.set), long = 1, iicc)
pmX <- probability(training.set, Prob, missing.fun = TRUE)
prob_parella <- probability.couple(Prob)
pmXY <- joint.probability(training.set, Prob, prob_parella)
HXY <- entropy.joint(pmXY, q, iicc)         # joint entropy term HXY
divergence.Shannon(training.set, H, HXY, correction)
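
As a cross-check, the identity given under Description can be verified with a minimal, self-contained R sketch (independent of MEET's helpers; the toy vectors x and y below are invented for illustration):

x <- c("A", "A", "C", "G", "A", "C", "G", "G", "A", "C")   # toy alignment column X
y <- c("A", "A", "C", "G", "C", "C", "G", "G", "A", "A")   # toy alignment column Y
pxy <- table(x, y) / length(x)     # empirical joint distribution p(x, y)
px <- rowSums(pxy)                 # marginal p(x)
py <- colSums(pxy)                 # marginal p(y)
nz <- pxy > 0                      # skip 0 * log(0) terms
mi.kl <- sum(pxy[nz] * log2(pxy[nz] / outer(px, py)[nz]))  # Kullback-Leibler form
shannon <- function(p) { p <- p[p > 0]; -sum(p * log2(p)) }
mi.h <- shannon(px) + shannon(py) - shannon(as.vector(pxy))  # H(X) + H(Y) - H(X,Y)
all.equal(mi.kl, mi.h)             # TRUE: both forms agree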
