entropy: Shannon entropy

View source: R/deepMetrics.r

entropy    R Documentation

Shannon entropy

Description

Shannon entropy

Usage

entropy(x, base = NULL)

Arguments

x

A vector of values, usually character labels as raw instances or as class frequencies.

base

A positive or complex number: the base with respect to which logarithms are computed. The default NULL indicates that the base is determined automatically from the number of class levels of x.

Details

Shannon entropy is a concept from information theory. It quantifies the impurity or randomness within a partition defined by the class levels of x: it is zero when all instances belong to a single class and reaches its maximum when all classes are equally frequent.
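
For reference, a minimal sketch of the underlying computation is shown below. It assumes frequencies are normalized to proportions p_i and that entropy is computed as -sum(p_i * log(p_i, base)), with the number of class levels used as the default base, as documented above. The helper shannon_entropy() is illustrative only and is not part of the package.

  # Minimal sketch of Shannon entropy for class frequencies (assumption:
  # this mirrors the documented behaviour; shannon_entropy is a
  # hypothetical helper, not the package function).
  shannon_entropy <- function(freq, base = NULL) {
    p <- freq / sum(freq)                     # class proportions
    p <- p[p > 0]                             # treat 0 * log(0) as 0
    if (is.null(base)) base <- length(freq)   # default: number of class levels
    -sum(p * log(p, base = base))
  }

  shannon_entropy(c(no = 5, yes = 9))         # two classes, so base 2 by default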

Value

The Shannon entropy of x.

See Also

Other Metrics: accuracy(), cross_entropy(), dice(), erf(), erfc(), erfcinv(), erfinv(), gini_impurity(), huber_loss(), iou(), log_cosh_loss(), mae(), mape(), mse(), msle(), quantile_loss(), rmse(), rmsle(), rmspe(), sse(), stderror(), vc(), wape(), wmape()

Examples

  entropy(c("no", "no", "yes", "yes", "yes", "no", "yes", "no", "yes", "yes", "yes", "yes", "yes", "no"))
  entropy(c("no" = 5, "yes" = 9))
