disc1997YK: Bayesian Estimation of Shannon Entropy by Yuan and Kesavan (1997)


View source: R/disc1997YK.R

Description

Bayesian Estimation of Shannon Entropy by Yuan and Kesavan (1997)

Usage

disc1997YK(counts, alpha = 1)
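The package source is not reproduced here, but Bayesian entropy estimators of this family are commonly built by placing a symmetric or general Dirichlet(α_1, …, α_K) prior on the category probabilities and plugging the posterior mean into the Shannon formula. The following is a minimal, hypothetical sketch of that general construction, not the function's actual internals; the name `entropy_posterior_mean` is illustrative only:

```r
# Hypothetical sketch of a Dirichlet posterior-mean plug-in estimator.
# This illustrates the general construction, not disc1997YK's source code.
entropy_posterior_mean <- function(counts, alpha = 1) {
  if (length(alpha) == 1) alpha <- rep(alpha, length(counts))
  p <- (counts + alpha) / sum(counts + alpha)  # posterior mean of the cell probabilities
  -sum(p * log(p))                             # Shannon entropy in nats
}

entropy_posterior_mean(c(10, 10, 10, 10, 10))  # log(5) ~ 1.6094 for uniform counts
```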

Arguments

counts

count information. Three types of input are accepted:

vector

a vector of length K whose entries are the counts (n_1, n_2, …, n_K) for each symbol of the alphabet.

histogram

an object of class histogram with K+1 breaks.

table

an object of class table of length K.

alpha

concentration parameter, given as a single number or a vector. If a single number α is provided, it is treated as the constant vector α_1 = α_2 = … = α_K = α. If a vector is provided, it must have the same length as counts.
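All three input types carry the same count information and reduce to one count vector of length K. A short base-R sketch of that coercion, independent of the package:

```r
# Sketch (base R only): histogram and table inputs reduce to the same
# count vector of length K that a plain vector input would supply.
xx <- rep(1:5, times = c(10, 8, 12, 9, 11))   # 50 observations over 5 symbols

hx <- hist(xx, breaks = seq(0.5, 5.5, by = 1), plot = FALSE)
tx <- table(xx)

counts_from_hist  <- hx$counts        # histogram input: K+1 breaks give K counts
counts_from_table <- as.integer(tx)   # table input: coerce cell counts to integer

identical(counts_from_hist, counts_from_table)  # TRUE
```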

Value

the estimated entropy value in nats (natural logarithm).
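Since the value is returned in nats, converting to other units is a change of logarithm base; for example, dividing by log(2) gives bits:

```r
# Converting an entropy value from nats (natural log) to bits (log base 2).
H_nats <- log(5)            # e.g., maximum entropy over a 5-symbol alphabet
H_bits <- H_nats / log(2)   # change of base: H_bits = H_nats / log(2)
H_bits                      # log2(5) ~ 2.3219
```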

Author(s)

Kisung You

References

Yuan, L. and Kesavan, H.K. (1997). Bayesian estimation of Shannon entropy.

Examples

## ----------------------------------------- ##
##  basic usage with three types of inputs
## ----------------------------------------- ##

#  Sample 50 numbers from 1 to 5 uniformly.
xx = sample(1:5, 50, replace=TRUE)

#  Create histogram, table, and count vector
hx = hist(xx, breaks=seq(from=0.5,to=5.5,by=1), plot=FALSE)
tx = table(xx)
cx = round(as.double(tx))

# compare three objects
line1 = paste0("entropy from histogram : ",round(disc1997YK(hx),4))
line2 = paste0("entropy from table     : ",round(disc1997YK(tx),4))
line3 = paste0("entropy from counts    : ",round(disc1997YK(cx),4))
cat("\n",line1,"\n",line2,"\n",line3)

## Not run: 
## ----------------------------------------- ##
## effect of concentration parameter 'alpha'
## ----------------------------------------- ##

# we consider scalar-valued alphas
vec.alpha = 10^(seq(from=-2,to=2,length.out=100))
vec.entmu = rep(0,100)

# for each alpha, repeat it 496 times and use table input
for (i in 1:100){
  enti = rep(0,496)
  for (j in 1:496){
    tx = table(sample(1:5, 100, replace=TRUE))
    enti[j] = disc1997YK(tx, alpha=vec.alpha[i])
  }
  vec.entmu[i] = base::mean(enti)
  print(paste0("* iteration ",i,"/100 complete..."))
}

# visualize
opar <- par(no.readonly=TRUE)
plot(vec.alpha, vec.entmu, lwd=2, type="b", cex=0.5,
     xlab="alpha", ylab="entropy", main="Bayesian Estimate of Entropy")
par(opar)

## End(Not run)

kyoustat/T4entropy documentation built on March 6, 2020, 12:56 a.m.