cross_entropy: Cross entropy

View source: R/deepMetrics.r

cross_entropy    R Documentation

Cross entropy

Description

Cross entropy

Usage

cross_entropy(p, q, base = NULL)

categorical_crossentropy(target, output, axis = -1)

Arguments

p

A vector of ground truth probabilities (true probability distribution).

q

A vector of estimated probabilities (estimated probability distribution).

base

A positive or complex number: the base with respect to which logarithms are computed. The default NULL corresponds to the natural base e = exp(1).

Details

Cross entropy quantifies the difference between two probability distributions. For a binary classification problem, the following equation can be used instead: -sum((p * log(q)) + ((1 - p) * log(1 - q)))
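
As a minimal sketch of the definitions above (not necessarily the package's exact implementation; the helper names ce and bce are hypothetical), the two formulas can be written in base R as:

  # general cross entropy H(p, q) = -sum(p * log(q)), with a configurable logarithm base
  ce <- function(p, q, base = NULL) {
    if (is.null(base)) base <- exp(1)  # natural logarithm by default
    -sum(p * log(q, base = base))
  }

  # binary special case using the complementary probabilities (1 - p) and (1 - q)
  bce <- function(p, q) {
    -sum((p * log(q)) + ((1 - p) * log(1 - q)))
  }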

Value

Cross entropy.

See Also

Other Metrics: accuracy(), dice(), entropy(), erf(), erfc(), erfcinv(), erfinv(), gini_impurity(), huber_loss(), iou(), log_cosh_loss(), mae(), mape(), mse(), msle(), quantile_loss(), rmse(), rmsle(), rmspe(), sse(), stderror(), vc(), wape(), wmape()

Examples

  # multiclass classification
  # each element represents the probability of the k-th class (k = 1,...,3)
  p <- c(0.10, 0.40, 0.50) # ground truth values
  q <- c(0.80, 0.15, 0.05) # estimated values, e.g. given by softmax function
  cross_entropy(p, q)

  # binary classification
  # the complementary probability is (1 - probability)
  p <- c(1)   # ground truth value
  q <- c(0.8) # estimated value
  cross_entropy(p, q)
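
  # cross entropy in bits rather than nats, via the documented `base` argument
  # (illustrative call; the output is not reproduced here)
  cross_entropy(p, q, base = 2)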
