MutualInformation: Compute the mutual information for a given channel and source distribution


Description

This function computes the mutual information (in bits) for a given channel. Optionally, it computes this quantity under a source distribution supplied as its second argument, rather than the source distribution stored with the channel. This can be used to compute the information rate for a channel that was optimized for one source distribution but is used to communicate values drawn from a different distribution.

Usage

MutualInformation(Q, px)

Arguments

Q

An information channel, as returned by BlahutAlgorithm, FindOptimalChannel, or FindRate.

px

Optional. If specified, this should be a vector of probabilities that sums to 1. The length of the vector should equal the number of symbols in the source alphabet.

Details

This algorithm works by direct implementation of the equation for mutual information. It can run slowly for channels with large source or destination alphabets.
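Concretely, writing p(x) for the source distribution and Q(y | x) for the channel's transition probabilities, the quantity computed is the standard mutual information (with base-2 logarithms, giving a result in bits):

```latex
I(X;Y) \;=\; \sum_{x} \sum_{y} p(x)\, Q(y \mid x)\, \log_2 \frac{Q(y \mid x)}{\sum_{x'} p(x')\, Q(y \mid x')}
```

The double sum over source and destination alphabets is what makes the computation slow for large alphabets, as noted above; terms with zero probability contribute nothing to the sum.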

Value

A single numeric value, corresponding to the information rate (in bits) for the channel and given source distribution.

Author(s)

Chris R. Sims

See Also

BlahutAlgorithm, FindOptimalChannel, FindRate

Examples

# Define a discretized Gaussian information source
x <- seq(from = -10, to = 10, length.out = 100)
Px <- dnorm(x, mean = 0, sd = 3)
Px <- Px / sum(Px) # Ensure that probability sums to 1
y <- x # The destination alphabet is the same as the source

# Define a quadratic cost function
cost.function <- function(x, y) {
    (y - x)^2
}

# Slope of the rate-distortion curve
s <- -1

# Compute the rate-distortion value at the given point s
channel <- BlahutAlgorithm(x, Px, y, cost.function, s)

# Compute the information rate for this channel assuming a different (uniform) source distribution
uniform.dist <- rep(1 / 100, 100)
MutualInformation(channel, uniform.dist)

RateDistortion documentation built on May 1, 2019, 9:52 p.m.