Description
This function computes the mutual information (in bits) for a given channel. It optionally computes this over a source distribution given by its second argument. This can be used to compute the information rate for a channel that is optimized for one source distribution but used to communicate values drawn from a different distribution.
Usage

MutualInformation(Q, px = NA)
Arguments

Q    An information channel, as returned by BlahutAlgorithm.

px   Optional. If specified, this should be a vector of probabilities that sums to 1. The length of the vector should equal the number of symbols in the source alphabet.
Details

This algorithm works by direct implementation of the equation for mutual information. It can run slowly for channels with large source or destination alphabets.
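The quantity being evaluated is I(X; Y) = sum over x, y of p(x) Q(y|x) log2( Q(y|x) / p(y) ), with p(y) = sum over x of p(x) Q(y|x). The following is a minimal sketch of that direct computation, assuming the channel's conditional probabilities are available as a matrix; the names Q.mat and mutual.information.direct are illustrative only and are not part of the package's interface.

# Q.mat: matrix whose entry [i, j] gives Pr(y_j | x_i); px: source distribution
mutual.information.direct <- function(Q.mat, px) {
    py <- as.vector(px %*% Q.mat)  # Marginal distribution over destination symbols
    mi <- 0
    for (i in seq_along(px)) {
        for (j in seq_along(py)) {
            if (px[i] > 0 && Q.mat[i, j] > 0) {
                # Accumulate p(x) Q(y|x) log2( Q(y|x) / p(y) )
                mi <- mi + px[i] * Q.mat[i, j] * log2(Q.mat[i, j] / py[j])
            }
        }
    }
    mi  # Mutual information in bits
}

The nested loop over both alphabets is what makes the computation slow when the source or destination alphabet is large.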
Value

A single numeric value, corresponding to the information rate (in bits) for the channel and given source distribution.
Author(s)

Chris R. Sims
See Also

BlahutAlgorithm, FindOptimalChannel, FindRate
Examples

# Define a discretized Gaussian information source
x <- seq(from = -10, to = 10, length.out = 100)
Px <- dnorm(x, mean = 0, sd = 3)
Px <- Px / sum(Px) # Ensure that probability sums to 1
y <- x # The destination alphabet is the same as the source
# Define a quadratic cost function
cost.function <- function(x, y) {
(y - x)^2
}
# Slope of the rate-distortion curve
s <- -1
# Compute the optimal channel at this point on the rate-distortion curve
channel <- BlahutAlgorithm(x, Px, y, cost.function, s)
# Compute the information rate for this channel assuming a different (uniform) source distribution
uniform.dist <- rep(1 / 100, 100)
MutualInformation(channel, uniform.dist)
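# Since px is optional (px = NA in the usage above), calling the function
# without a second argument should give the rate under the source
# distribution the channel was optimized for; comparing it with the call
# above shows the effect of the mismatched (uniform) source. This follow-up
# call is a sketch based on that default, not part of the original example.
MutualInformation(channel)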