least_divergent: Calculate the least divergent elements from a word.


View source: R/top_relations.R

Description

Kullback-Leibler divergence, or relative entropy, between two vectors is calculated as follows:

D(p,q) = sum( p(x) * log( p(x) / q(x) ) )

Although it behaves like a measure of distance between vectors, KL-divergence is not a true distance metric: it is not symmetric, so in general D(p,q) != D(q,p).
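
The definition above can be sketched directly in R. Note that kl_div below is an illustrative helper written for this example, not a function exported by the package:

```r
# Kullback-Leibler divergence D(p, q) = sum( p(x) * log( p(x) / q(x) ) )
# Assumes p and q are probability vectors of equal length with no zero entries.
kl_div <- function(p, q) {
  sum(p * log(p / q))
}

p <- c(0.5, 0.3, 0.2)
q <- c(0.4, 0.4, 0.2)

kl_div(p, q)  # not equal to kl_div(q, p): the divergence is asymmetric
```

Because of this asymmetry, swapping the arguments generally yields a different value, which is why KL-divergence is called a divergence rather than a distance.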

Usage

least_divergent(m, word, num_results = 10)

Arguments

m

A matrix.

word

An element of the matrix.

num_results

Number of results to display (default 10).

Value

A data table of elements in m least divergent from word.

References

Cover, Thomas M., and Joy A. Thomas. Elements of Information Theory. Wiley-Interscience, 1991.

Examples

least_divergent(W, "loue")
least_divergent(W, "loue", num_results = 20)

ajfabry/Statspeare documentation built on Jan. 26, 2020, 7:44 a.m.