ent: Entropy

View source: R/ent.R


Entropy

Description

This function calculates the entropy given observations of a univariate variable and samples of a predictive distribution.

Usage

ent(y, x, bins = NULL, na.rm = FALSE)

Arguments

y

vector of observations

x

matrix of samples of a predictive distribution (depending on y; see details)

bins

numeric; if NULL, the number of bins equals ncol(x)+1; otherwise bins must be chosen so that (ncol(x)+1)/bins is an integer; default: NULL (see details)

na.rm

logical; if TRUE, NAs are stripped before the rank computation proceeds; if FALSE, NAs are used in the rank computation; default: FALSE

Details

For a vector y of length n, x should be given as a matrix with n rows, where the i-th entry of y belongs to the i-th row of x. The columns of x represent the samples of a predictive distribution.

The parameter bins specifies the number of columns of the verification rank histogram (VRH). For "large" ncol(x) it is often reasonable to reduce the resolution of the VRH by choosing bins so that (ncol(x)+1)/bins is an integer.
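For example, with ncol(x) = 50 ensemble members there are ncol(x)+1 = 51 possible ranks, so the admissible values of bins are the divisors of 51. A small helper (the function name is illustrative, not part of the package) can list them:

```r
# Admissible 'bins' values are the divisors of ncol(x)+1,
# since (ncol(x)+1)/bins must be an integer.
admissible_bins <- function(m) {
  k <- m + 1
  which(k %% seq_len(k) == 0)   # all divisors of m+1
}
admissible_bins(50)  # 1 3 17 51
```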

The entropy is a tool to assess the calibration of a forecast. Its optimal value is 1, representing a calibrated forecast.
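To make the idea concrete, the following is a minimal sketch of how a normalized VRH entropy could be computed; it illustrates the concept and is not the package's actual implementation (the function name vrh_entropy is hypothetical):

```r
# Sketch: rank each observation among its ensemble, bin the ranks,
# and compute the entropy of the relative frequencies, normalized
# by log(bins) so that a perfectly flat histogram yields 1.
vrh_entropy <- function(y, x, bins) {
  m <- ncol(x)
  stopifnot((m + 1) %% bins == 0)  # (ncol(x)+1)/bins must be an integer
  # rank of y[i] within c(y[i], x[i, ]), i.e. a value in 1..m+1
  ranks <- sapply(seq_along(y), function(i) rank(c(y[i], x[i, ]))[1])
  # map ranks 1..m+1 onto classes 1..bins of equal width
  classes <- ceiling(ranks / ((m + 1) / bins))
  p <- tabulate(classes, nbins = bins) / length(y)
  p <- p[p > 0]                    # treat 0 * log(0) as 0
  -sum(p * log(p)) / log(bins)
}
```

A uniform rank histogram (equal frequency in every class) gives an entropy of exactly 1, while a degenerate histogram with all mass in one class gives 0.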

Value

Numeric vector containing the score value.

Author(s)

David Jobst

References

Tribus, M. (1969). Rational Descriptions, Decisions and Designs. Pergamon Press.

Examples

# simulated data
set.seed(1)
n <- 30
m <- 50
y <- rnorm(n)
x <- matrix(rnorm(n*m), ncol = m)

# entropy calculation
ent(y = y, x = x, bins = 3)


jobstdavid/eppverification documentation built on May 13, 2024, 5:20 p.m.