Description

This function calculates Mutual Information (Rényi order equal to 1) by means of the Kullback-Leibler divergence.
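The connection can be seen in a few lines of base R. This is a toy sketch, not MEET's implementation; the distribution pXY and the helper ent are made up for illustration. Mutual information is the Kullback-Leibler divergence between the joint distribution p(x,y) and the product of its marginals, and it equals H(X) + H(Y) - H(X,Y), the form that the H and HXY arguments feed into.

## Toy sketch (base R, illustrative only): mutual information as
## KL( p(x,y) || p(x)p(y) ), and as H(X) + H(Y) - H(X,Y).
pXY <- matrix(c(0.30, 0.10,        # made-up joint distribution of two
                0.15, 0.45),       # binary variables X and Y
              nrow = 2, byrow = TRUE)
pX <- rowSums(pXY)                 # marginal of X
pY <- colSums(pXY)                 # marginal of Y

## Kullback-Leibler form
MI.kl <- sum(pXY * log2(pXY / outer(pX, pY)))

## Entropy form: H(X) + H(Y) - H(X,Y)
ent <- function(p) -sum(p * log2(p))
MI.h <- ent(pX) + ent(pY) - ent(pXY)

all.equal(MI.kl, MI.h)             # TRUE: the two forms coincide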
Usage

divergence.Shannon(training.set, H, HXY, correction)
Arguments

training.set: A set of aligned nucleotide sequences.
H: Entropy.
HXY: Joint entropy.
correction: Correction for the finite sample size effect (a common form of such a correction is sketched below).
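The exact formula behind the package's correction.entropy is not reproduced on this page. One widely used finite-sample-size correction for plug-in entropy estimates is the Miller-Madow adjustment, sketched here as an assumption for exposition; the function name entropy.miller.madow and the sample data are hypothetical, and MEET's correction may differ.

## Illustrative sketch of the Miller-Madow correction (an assumption;
## not necessarily what MEET's correction.entropy computes). The plug-in
## entropy estimate is biased downward by roughly (M - 1) / (2 * N),
## where M is the number of observed symbols and N the sample size.
entropy.miller.madow <- function(x) {
  N    <- length(x)
  freq <- table(x) / N               # observed symbol frequencies
  M    <- length(freq)               # number of distinct symbols
  H.ml <- -sum(freq * log2(freq))    # plug-in (maximum-likelihood) estimate
  H.ml + (M - 1) / (2 * N * log(2))  # bias correction, converted to bits
}

entropy.miller.madow(c("A", "C", "G", "T", "A", "A", "C", "G"))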
Details

The Rényi order has to be equal to 1, the order at which the Rényi entropy reduces to the Shannon entropy.
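A minimal numerical check of that limit, in base R (not MEET code; the distribution p is made up): the Rényi entropy H_q = log2(sum(p^q)) / (1 - q) approaches the Shannon entropy -sum(p * log2(p)) as q approaches 1.

## Renyi entropy approaches Shannon entropy as q -> 1
p <- c(0.5, 0.25, 0.125, 0.125)       # toy distribution

renyi <- function(p, q) log2(sum(p^q)) / (1 - q)
shannon <- -sum(p * log2(p))          # 1.75 bits for this p

renyi(p, 0.999)                       # approx 1.75, matching shannon
renyi(p, 1.001)                       # approx 1.75, matching shannon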
Author(s)

Joan Maynou <joan.maynou@upc.edu>
See Also

divergence.Renyi, PredictDivergence, kfold.divergence
Examples

require("MEET")
data(TranscriptionFactor)   # example transcription factor alignment
data(BackgroundOrganism)    # background organism data (provides Prob)
data(iicc)

q <- 1                      # Renyi order (must be 1 for the Shannon case)
training.set <- TranscriptionFactor

## Finite-sample-size correction for the entropy estimates
correction <- correction.entropy(q, p = nrow(training.set), long = 1, iicc)

## Entropy of the background distribution
HXmax <- entropy.Shannon(as.matrix(Prob))

## Position probabilities and entropy of the training set
pmX <- probability(training.set, Prob)
Probtrans <- probability.couple(Prob)
H <- entropy.Shannon(pmX)

## Joint probabilities and joint entropy
pmXY <- joint.probability(training.set, Prob, Probtrans)
HXY <- entropy.joint(pmXY, q, iicc)

divergence.Shannon(training.set, H, HXY, correction)