View source: R/top_relations.R
Description

Kullback-Leibler divergence, or relative entropy, between two probability vectors p and q is calculated as follows:

D(p, q) = sum( p(x) * log( p(x) / q(x) ) )

Although KL divergence resembles a distance between vectors, it is not a true distance metric: it is not symmetric, so in general D(p, q) != D(q, p).
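As a concrete illustration, the divergence can be computed directly in R. This is a minimal sketch, not code from this package; it assumes p and q are numeric vectors already normalized to sum to 1, with q positive wherever p is:

kl_divergence <- function(p, q) {
  stopifnot(length(p) == length(q))
  keep <- p > 0                      # terms with p(x) = 0 contribute 0
  sum(p[keep] * log(p[keep] / q[keep]))
}

p <- c(0.5, 0.3, 0.2)
q <- c(0.4, 0.4, 0.2)
kl_divergence(p, q)  # approx 0.0253
kl_divergence(q, p)  # approx 0.0258; the two differ, so D is not symmetric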
Usage

least_divergent(m, word, num_results = 10)
Arguments

m            A matrix.
word         An element of the matrix.
num_results  Number of results to display.
Value

A data table of the elements in m least divergent from word.
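The actual implementation is in R/top_relations.R (linked above). Purely as a hypothetical sketch of how a function with this interface could work, assuming the named rows of m are treated as distributions after normalization (the name least_divergent_sketch and all internals here are assumptions, not the package's code):

library(data.table)

least_divergent_sketch <- function(m, word, num_results = 10) {
  p <- m[word, ] / sum(m[word, ])          # normalize the target row
  kl <- apply(m, 1, function(row) {
    q <- row / sum(row)
    keep <- p > 0                          # treat 0 * log(0/q) as 0
    sum(p[keep] * log(p[keep] / q[keep]))  # D(p, q)
  })
  res <- data.table(element = rownames(m), divergence = kl)
  res <- res[element != word][order(divergence)]  # drop the word itself, rank ascending
  head(res, num_results)
}

Under these assumptions, rows that place zero mass where the target word has positive mass receive infinite divergence and sort to the bottom of the ranking.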
References

Cover, Thomas M., and Joy A. Thomas. Elements of Information Theory. Wiley-Interscience, 1991.
Examples

least_divergent(W, "loue")
least_divergent(W, "loue", num_results = 20)