divergence.Shannon: Mutual Information


Description

This function calculates the Mutual Information (Rényi order equal to 1) by means of the Kullback-Leibler divergence.

Usage

divergence.Shannon(training.set, H, HXY, correction)

Arguments

training.set

A set of aligned nucleotide sequences.

H

Entropy of the training set, as returned by entropy.Shannon.

HXY

Joint entropy, as returned by entropy.joint.

correction

Correction for the finite sample size effect, as returned by correction.entropy.

Details

The Rényi order has to be equal to 1.
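At Rényi order 1, the Rényi divergence reduces to the Kullback-Leibler divergence, and mutual information follows the standard information-theoretic identity. Assuming the arguments H and HXY correspond to the marginal and joint Shannon entropies (as the Examples below suggest), the quantity computed can be written as:

```latex
I(X;Y) = H(X) + H(Y) - H(X,Y)
       = D_{KL}\big(p(x,y)\,\|\,p(x)\,p(y)\big)
```

This is a sketch of the underlying identity, not a statement of the package's exact internals.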

Author(s)

Joan Maynou <joan.maynou@upc.edu>

See Also

divergence.Renyi, PredictDivergence, kfold.divergence

Examples

require("MEET")
data(TranscriptionFactor)                  # aligned binding-site sequences
data(BackgroundOrganism)                   # background model (provides Prob)
data(iicc)
q <- 1                                     # Renyi order; must be 1 for Shannon
training.set <- TranscriptionFactor
correction <- correction.entropy(q, p = nrow(training.set), long = 1, iicc)
HXmax <- entropy.Shannon(as.matrix(Prob))  # maximum (background) entropy
pmX <- probability(training.set, Prob)     # marginal probabilities
Probtrans <- probability.couple(Prob)
H <- entropy.Shannon(pmX)                  # entropy of the training set
pmXY <- joint.probability(training.set, Prob, Probtrans)
HXY <- entropy.joint(pmXY, q, iicc)        # joint entropy
divergence.Shannon(training.set, H, HXY, correction)

MEET documentation built on May 2, 2019, 1:45 p.m.