entknn
Estimating entropy from data with the kNN method.
entknn(x, k = 3, dt = 2)
x: the data, with each row as a sample
k: the k-th nearest neighbour; default = 3
dt: the type of distance between samples; 1 for Euclidean distance, any other value (default 2) for maximum distance
This function estimates entropy from data with the kNN method proposed in Kraskov et al. (2004). The algorithm is the second step of estimating copula entropy with copent.
The argument x is the data, with each row a sample from the random variables. The arguments k and dt are used in the kNN estimation: k is the k-th nearest neighbour (default = 3) and dt is the type of distance between samples, which currently has two options (1 for Euclidean distance; 2, the default, for maximum distance).
The function returns the estimated entropy value of data x.
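As an illustration of the underlying method, here is a minimal sketch of the Kozachenko-Leonenko / Kraskov kNN entropy estimator for the maximum-distance case (dt = 2). The helper name knn_entropy is hypothetical; this is a didactic sketch based on the formula in Kraskov et al. (2004), not the package's implementation.

# Minimal sketch of the Kraskov et al. (2004) kNN entropy estimator,
# maximum-norm case; illustrative only, not the copent implementation.
knn_entropy <- function(x, k = 3) {
  x <- as.matrix(x)
  n <- nrow(x)
  d <- ncol(x)
  # pairwise maximum-norm distances between samples
  distx <- as.matrix(dist(x, method = "maximum"))
  # distance from each sample to its k-th nearest neighbour
  # (position 1 of the sorted row is the zero self-distance)
  eps <- apply(distx, 1, function(row) sort(row)[k + 1])
  # Kozachenko-Leonenko estimator in nats; under the maximum norm the
  # log-volume term reduces to the factor 2 inside the logarithm
  digamma(n) - digamma(k) + d * mean(log(2 * eps))
}

On continuous data this sketch should give values close to entknn(x, k, dt = 2).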
Kraskov, A., Stögbauer, H., & Grassberger, P. (2004). Estimating mutual information. Physical Review E, 69(6), 066138.
library(copent)  # provides entknn
library(mnormt)  # provides rmnorm

rho <- 0.5
sigma <- matrix(c(1, rho, rho, 1), 2, 2)  # covariance matrix with correlation 0.5
x <- rmnorm(500, c(0, 0), sigma)          # 500 samples from a bivariate Gaussian
xent1 <- entknn(x)                        # entropy estimate with defaults (k = 3, dt = 2)
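As a possible follow-up, the estimate can be recomputed with a different neighbour count or distance type and compared against the closed-form differential entropy of the simulated bivariate Gaussian, 0.5 * log((2*pi*e)^d * det(sigma)); the comparison assumes the estimate is returned in nats.

xent2 <- entknn(x, k = 5)           # larger neighbourhood
xent3 <- entknn(x, k = 3, dt = 1)   # Euclidean instead of maximum distance
0.5 * log((2 * pi * exp(1))^2 * det(sigma))  # theoretical value, about 2.69 nats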