Estimates the transfer entropy from one time series to another, where each time series consists of continuous random variables. Transfer entropy is an extension of mutual information that takes into account the direction of information flow, under the assumption that the underlying processes can be described by a Markov model. Two estimation methods are provided. The first calculates transfer entropy as a difference of mutual information terms; each mutual information is estimated with the Kraskov method, which builds on a nearest-neighbor framework (see package references). The second method estimates transfer entropy via a generalized correlation sum.
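The first method rests on the decomposition TE(X→Y) = I(Y_t; X_{t-1}, Y_{t-1}) − I(Y_t; Y_{t-1}). The sketch below illustrates that decomposition only; it is not the package's implementation. The package estimates the mutual-information terms with the Kraskov nearest-neighbor estimator, whereas this sketch uses a crude plug-in estimator on discretized data (function names, bin count, and test signal are illustrative assumptions).

```r
# Plug-in mutual information (in nats) from a joint contingency table.
plugin_mi <- function(a, b) {
  tab <- table(a, b) / length(a)        # joint probabilities
  pa  <- rowSums(tab)                   # marginal of a
  pb  <- colSums(tab)                   # marginal of b
  idx <- tab > 0
  sum(tab[idx] * log(tab[idx] / outer(pa, pb)[idx]))
}

# Sketch of TE(X -> Y) with history length 1:
#   TE = I(Y_t ; X_{t-1}, Y_{t-1}) - I(Y_t ; Y_{t-1})
transfer_entropy_sketch <- function(x, y, bins = 4) {
  dx <- cut(x, bins, labels = FALSE)    # discretize for the plug-in estimator
  dy <- cut(y, bins, labels = FALSE)
  n  <- length(dy)
  y_now  <- dy[2:n]
  y_past <- dy[1:(n - 1)]
  x_past <- dx[1:(n - 1)]
  joint_past <- interaction(x_past, y_past, drop = TRUE)
  plugin_mi(y_now, joint_past) - plugin_mi(y_now, y_past)
}

set.seed(1)
x <- rnorm(2000)
y <- c(0, 0.8 * x[-2000]) + 0.2 * rnorm(2000)  # y is driven by lagged x
transfer_entropy_sketch(x, y)  # should exceed transfer_entropy_sketch(y, x)
```

Because information flows from x to y but not back, the estimate in the driving direction comes out larger, which is the asymmetry that distinguishes transfer entropy from plain mutual information.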
|Author||ANN Library: David Mount, Sunil Arya (see src/ann_1.1.2/Copyright.txt). Transfer Entropy Package: Ghazaleh Haratinezhad Torbati, Glenn Lawyer.|
|Date of publication||2016-04-26 16:50:27|
|Maintainer||Ghazaleh Haratinezhad Torbati <[email protected]>|
|License||GPL (>= 2) | file LICENSE|
|Package repository||View on CRAN|
Install the latest version of this package by entering the following in R:
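The install command itself is missing from this page. Assuming the package name matches its CRAN listing (an assumption; check the repository link above), the standard form is:

```r
# Package name assumed from the CRAN listing; verify before installing.
install.packages("TransferEntropy")
```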