The function RelativeEntropy computes the relative entropy between two probability distributions.
RelativeEntropy(p, q, group.index = NULL)
p: a numeric vector representing a probability distribution.

q: a numeric vector representing a probability distribution.

group.index: if provided, the relative entropy will be decomposed according to the chain rule (see below for more details). The default is NULL.
Relative entropy can be thought of as a measure of distance between two probability distributions. It is also known as the Kullback-Leibler divergence and is usually denoted by H(p|q); here it is computed with the natural logarithm, H(p|q) = sum_i p[i] * log(p[i] / q[i]). It is not a metric, as it is not symmetric and does not satisfy the triangle inequality.
If there is an index i where q[i] == 0 but p[i] > 0, then the relative entropy is Inf. Mathematically, this happens when p is not absolutely continuous with respect to q.
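As an illustration of the two paragraphs above, the quantity can be computed directly from the definition H(p|q) = sum_i p[i] * log(p[i]/q[i]). The following is a minimal Python sketch of the math only (the package itself is written in R, and this is not its implementation):

```python
import math

def relative_entropy(p, q):
    """Kullback-Leibler divergence H(p|q) = sum_i p[i] * log(p[i]/q[i]).

    Returns math.inf when some q[i] == 0 while p[i] > 0, i.e. when p is
    not absolutely continuous with respect to q. Terms with p[i] == 0
    contribute 0 by the usual convention 0 * log(0) = 0.
    """
    total = 0.0
    for pi, qi in zip(p, q):
        if pi == 0:
            continue          # 0 * log(0/q) is taken to be 0
        if qi == 0:
            return math.inf   # p is not absolutely continuous w.r.t. q
        total += pi * math.log(pi / qi)
    return total

print(round(relative_entropy([0.3, 0.3, 0.4], [0.5, 0.3, 0.2]), 7))  # 0.1240112
print(relative_entropy([0.5, 0.5], [1.0, 0.0]))                      # inf
```

The first value matches the package output shown in the Examples section below.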
If group.index is provided, the relative entropy will be decomposed using the chain rule stated in Lemma 3.1(i) of Pal and Wong (2013); see equation (23) there. In this case the output has 1 + 1 + m components, where m is the number of groups defined by group.index. The first component is the left-hand side of (23), the second component is the first term on the right-hand side of (23), and the remaining m components are the terms in the sum on the right-hand side of (23).
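The standard chain rule for relative entropy can be checked numerically: with group totals P_j = sum of p[i] over group j (and Q_j defined likewise), H(p|q) = H(P|Q) + sum_j P_j * H(p restricted to group j, normalized | q restricted to group j, normalized). The Python sketch below mirrors the output layout described above; the function and variable names are illustrative, not the package's API, and it assumes all group totals are positive:

```python
import math
from collections import defaultdict

def rel_ent(p, q):
    # H(p|q) = sum_i p[i]*log(p[i]/q[i]); assumes p[i] > 0 implies q[i] > 0
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def rel_ent_decomposed(p, q, group_index):
    # Chain rule: H(p|q) = H(P|Q) + sum_j P_j * H(p^(j)|q^(j)),
    # where P, Q are the group totals and p^(j), q^(j) are the
    # within-group conditional distributions.
    groups = defaultdict(lambda: ([], []))
    for pi, qi, g in zip(p, q, group_index):
        groups[g][0].append(pi)
        groups[g][1].append(qi)
    P = [sum(ps) for ps, _ in groups.values()]
    Q = [sum(qs) for _, qs in groups.values()]
    between = rel_ent(P, Q)
    within = [Pj * rel_ent([x / Pj for x in ps], [y / Qj for y in qs])
              for Pj, Qj, (ps, qs) in zip(P, Q, groups.values())]
    # 1 + 1 + m components: the total, the between-group term,
    # then the m within-group terms.
    return [between + sum(within), between] + within

p = [0.3, 0.3, 0.4]
q = [0.5, 0.3, 0.2]
out = rel_ent_decomposed(p, q, [1, 1, 2])
print([round(x, 7) for x in out])
# The first component equals the undecomposed relative entropy:
assert math.isclose(out[0], rel_ent(p, q))
```

Summing the second component and the m within-group components recovers the first component, which is the identity in equation (23).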
A non-negative number, or +Inf, if group.index is not given; a numeric vector if group.index is given.
Pal, S. and T.-K. L. Wong (2013). Energy, entropy, and arbitrage. arXiv preprint arXiv:1308.5376.
p <- c(0.3, 0.3, 0.4)
q <- c(0.5, 0.3, 0.2)
RelativeEntropy(p, q)
RelativeEntropy(q, p) # relative entropy is not symmetric
[1] 0.1240112
[1] 0.1167834