gJSD    R Documentation
This function computes the Generalized Jensen-Shannon Divergence of a probability matrix.
gJSD(x, unit = "log2", weights = NULL, est.prob = NULL)
x: a probability matrix.
unit: a character string specifying the logarithm unit that shall be used to compute distances that depend on log computations.
weights: a numeric vector specifying the weights for each distribution in x. Default: weights = NULL, in which case all distributions are weighted equally.
est.prob: method to estimate probabilities from input count vectors such as non-probability vectors. Default: est.prob = NULL. Setting est.prob = "empirical" estimates probabilities as the relative frequencies of each count vector (see Examples).
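As a brief illustration, a hedged sketch of unequal weighting (the weight values below are purely illustrative; they are assumed to sum to 1, with one weight per row of x):

x <- rbind(1:10/sum(1:10), 20:29/sum(20:29), 30:39/sum(30:39))
gJSD(x, weights = c(0.5, 0.25, 0.25))  # first distribution weighted more heavily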
Function to compute the Generalized Jensen-Shannon Divergence
JSD_{\pi_1,...,\pi_n}(P_1, ..., P_n) = H(\sum_{i = 1}^n \pi_i * P_i) - \sum_{i = 1}^n \pi_i * H(P_i)

where \pi_1, ..., \pi_n denote the weights selected for the probability vectors P_1, ..., P_n, and H(P_i) denotes the Shannon Entropy of probability vector P_i.
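As an illustration, the formula can be evaluated directly in R (a sketch assuming equal weights \pi_i = 1/n and the default log2 unit):

H <- function(p) -sum(p * log2(p))            # Shannon Entropy in bits
P <- rbind(1:10/sum(1:10), 20:29/sum(20:29), 30:39/sum(30:39))
w <- rep(1/nrow(P), nrow(P))                  # equal weights pi_1, ..., pi_n
H(colSums(w * P)) - sum(w * apply(P, 1, H))   # should agree with gJSD(P)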
The Generalized Jensen-Shannon Divergence of the input probability matrix.
Author: Hajk-Georg Drost

See also: KL, H, JSD, CE, JE
# define input probability matrix
Prob <- rbind(1:10/sum(1:10), 20:29/sum(20:29), 30:39/sum(30:39))
# compute the Generalized JSD comparing the rows of the probability matrix
gJSD(Prob)
# Generalized Jensen-Shannon Divergence between three vectors using different log bases
gJSD(Prob, unit = "log2") # Default
gJSD(Prob, unit = "log")
gJSD(Prob, unit = "log10")
# Generalized Jensen-Shannon Divergence between count vectors P.count, Q.count, and R.count
P.count <- 1:10
Q.count <- 20:29
R.count <- 30:39
x.count <- rbind(P.count, Q.count, R.count)
gJSD(x.count, est.prob = "empirical")