Description

Bayesian estimation of Shannon entropy by Yuan and Kesevan (1997).
Usage

disc1997YK(counts, alpha = 1)
Arguments

counts
count information. Three types of input are allowed: an object of class "histogram", an object of class "table", or a vector of nonnegative counts.
alpha
concentration parameter, given as a single number or a vector. If a single number is provided, it is taken as α_1 = α_2 = … = α_K = α. If a vector is provided, it must have the same length as counts.
Value

estimated entropy value in nats (natural logarithm).
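As a rough sketch of the idea behind a Bayesian entropy estimate with a Dirichlet(α) prior, one can compute the plug-in entropy at the posterior mean probabilities. This is an assumption for illustration only, with a hypothetical helper `sketch_entropy`; the package's closed-form estimator may differ.

```r
# Hedged sketch (base R only): plug-in entropy at the Dirichlet posterior mean.
# NOT the package's exact implementation; for intuition only.
sketch_entropy = function(counts, alpha = 1){
  K    = length(counts)
  avec = if (length(alpha) == 1) rep(alpha, K) else alpha
  # posterior mean probabilities under Dirichlet(avec) prior
  phat = (counts + avec) / (sum(counts) + sum(avec))
  # Shannon entropy in nats
  -sum(phat * log(phat))
}

# uniform counts over 5 bins with uniform prior: equals log(5) exactly
sketch_entropy(rep(10, 5))
```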
Author(s)

Kisung You
References

Yuan L, Kesevan HK (1997). "Bayesian estimation of Shannon entropy."
Examples

## ----------------------------------------- ##
## basic usage with three types of inputs
## ----------------------------------------- ##
# Sample 50 numbers from 1 to 5 uniformly.
xx = sample(1:5, 50, replace=TRUE)
# Create histogram, table, and count vector
hx = hist(xx, breaks=seq(from=0.5,to=5.5,by=1), plot=FALSE)
tx = table(xx)
cx = round(as.double(tx))
# compare three objects
line1 = paste0("entropy from histogram : ",round(disc1997YK(hx),4))
line2 = paste0("entropy from table : ",round(disc1997YK(tx),4))
line3 = paste0("entropy from counts : ",round(disc1997YK(cx),4))
cat("\n",line1,"\n",line2,"\n",line3)
## Not run:
## ----------------------------------------- ##
## effect of concentration parameter 'alpha'
## ----------------------------------------- ##
# we consider scalar-valued alphas
vec.alpha = 10^(seq(from=-2,to=2,length.out=100))
vec.entmu = rep(0,100)
# for each alpha, repeat it 496 times and use table input
for (i in 1:100){
  enti = rep(0,496)
  for (j in 1:496){
    tx      = table(sample(1:5, 100, replace=TRUE))
    enti[j] = disc1997YK(tx, alpha=vec.alpha[i])
  }
  vec.entmu[i] = base::mean(enti)
  print(paste0("* iteration ",i,"/100 complete..."))
}
# visualize
opar <- par(no.readonly=TRUE)
plot(vec.alpha, vec.entmu, lwd=2, type="b", cex=0.5,
     xlab="alpha", ylab="entropy", main="Bayesian Estimate of Entropy")
par(opar)
## End(Not run)
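The trend that the alpha sweep above visualizes can be checked with a small self-contained sketch. This uses a hypothetical Dirichlet-posterior-mean plug-in estimator written here for illustration, not the package function: as alpha shrinks toward 0 the estimate approaches the naive MLE plug-in entropy, and as alpha grows it is pulled toward the uniform value log(K).

```r
# Hedged sketch of the limiting behavior of a Dirichlet-posterior-mean
# plug-in entropy estimate (assumption; not disc1997YK itself).
posterior_mean_entropy = function(counts, alpha){
  phat = (counts + alpha) / (sum(counts) + length(counts)*alpha)
  -sum(phat * log(phat))
}

counts = c(40, 30, 20, 10)
small  = posterior_mean_entropy(counts, 1e-8)   # close to the MLE plug-in entropy
large  = posterior_mean_entropy(counts, 1e+6)   # close to log(4), the uniform limit
```

The same shrinkage toward log(K) explains the rising curve in the plot for large alpha.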