cross_entropy
Cross entropy
Usage

cross_entropy(p, q, base = NULL)
Arguments

p	A vector of ground truth probabilities (the true probability distribution).

q	A vector of estimated probabilities (the estimated probability distribution).

base	A positive or complex number: the base with respect to which logarithms are computed. Defaults to NULL.
Details

Cross entropy quantifies the difference between two probability distributions. For discrete distributions p (ground truth) and q (estimate) it is computed as:

-sum(p * log(q))

For a binary classification problem, the following equivalent form can be used:

-sum((p * log(q)) + ((1 - p) * log(1 - q)))
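As a reference point, the following minimal sketch computes the same quantity directly from the formula above (manual_cross_entropy is a hypothetical helper, not part of the package; the natural logarithm is assumed as the default base):

manual_cross_entropy <- function(p, q, base = exp(1)) {
  # H(p, q) = -sum over classes of p * log(q), with the logarithm taken in `base`
  -sum(p * log(q, base = base))
}

manual_cross_entropy(c(0.10, 0.40, 0.50), c(0.80, 0.15, 0.05))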
Value

Cross entropy.
See Also

Other Metrics: accuracy(), dice(), entropy(), erf(), erfc(), erfcinv(), erfinv(), gini_impurity(), huber_loss(), iou(), log_cosh_loss(), mae(), mape(), mse(), msle(), quantile_loss(), rmse(), rmsle(), rmspe(), sse(), stderror(), vc(), wape(), wmape()
Examples

# multiclass classification
# each element represents the probability of the k-th class (k = 1, ..., 3)
p <- c(0.10, 0.40, 0.50) # ground truth values
q <- c(0.80, 0.15, 0.05) # estimated values, e.g. given by a softmax function
cross_entropy(p, q)

# binary classification
# the complementary probability is (1 - probability)
p <- c(1)   # ground truth value
q <- c(0.8) # estimated value
cross_entropy(p, q)
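# The base argument switches the logarithm base; for example, base = 2 expresses
# the result in bits. This call is illustrative and assumes base is forwarded to
# the logarithm as documented above.
p <- c(0.10, 0.40, 0.50)
q <- c(0.80, 0.15, 0.05)
cross_entropy(p, q, base = 2)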