View source: R/divergence.Shannon.R
Description

This function calculates Mutual Information (Rényi order equal to 1) by means of the Kullback-Leibler divergence.
Usage

divergence.Shannon(training.set, H, HXY, correction)
Arguments

training.set    A set of aligned nucleotide sequences.
H               Entropy.
HXY             Joint entropy.
correction      Correction for the finite sample size effect.
Details

The Rényi order has to be equal to 1. For q = 1 the Rényi entropy reduces to the Shannon entropy, so the divergence computed here is the ordinary Kullback-Leibler divergence.
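At order q = 1 the quantity being computed is the mutual information, i.e. the Kullback-Leibler divergence between the joint distribution and the product of its marginals, which equals H(X) + H(Y) - H(X,Y). A minimal numeric sketch of that identity (in Python with a made-up 2x2 joint distribution, purely for illustration; this is not the package's R code):

```python
# Sketch: mutual information as a Kullback-Leibler divergence (q = 1).
# The joint distribution below is an arbitrary illustrative example.
import numpy as np

pxy = np.array([[0.30, 0.10],
                [0.15, 0.45]])     # joint distribution p(x, y)
px = pxy.sum(axis=1)               # marginal p(x)
py = pxy.sum(axis=0)               # marginal p(y)

# Mutual information as KL( p(x,y) || p(x)p(y) ), in bits
kl = np.sum(pxy * np.log2(pxy / np.outer(px, py)))

# The same quantity from the Shannon entropies H(X), H(Y) and H(X,Y)
H = lambda p: -np.sum(p * np.log2(p))
mi = H(px) + H(py) - H(pxy)

print(round(kl, 6), round(mi, 6))  # the two computations agree
```

The two routes give the same value, which is why the function takes the entropy H and the joint entropy HXY as arguments.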
Author(s)

Joan Maynou <joan.maynou@upc.edu>
References

J. Maynou, M. Vallverdu, F. Claria, J. J. Gallardo-Chacon, P. Caminal and A. Perera, "Transcription Factor Binding Site Detection through Position Cross-Mutual Information Variability Analysis," 31st Annual International Conference of the IEEE Engineering in Medicine and Biology Society.
See Also

divergence.Renyi, PredictDivergence, kfold.divergence
Examples

data(BackgroundOrganism)
data(iicc)
HXmax <- iicc$HXmax
H <- iicc$Entropy[[1]]
q <- 1
training.set <- iicc$Transcriptionfactor
correction <- correction.entropy(q, p = nrow(training.set), long = 1, iicc)
pmX <- probability(training.set, Prob, missing.fun = TRUE)
prob_parella <- probability.couple(Prob)
pmXY <- joint.probability(training.set, Prob, prob_parella)
HXY <- entropy.joint(pmXY, q, iicc)
divergence.Shannon(training.set, H, HXY, correction)