dist_one_one: Distances and Similarities between Two Probability Density Functions

View source: R/RcppExports.R

dist_one_one    R Documentation

Distances and Similarities between Two Probability Density Functions

Description

This function computes the distance/dissimilarity between two probability density functions.

Usage

dist_one_one(
  P,
  Q,
  method,
  p = NA_real_,
  testNA = TRUE,
  unit = "log",
  epsilon = 1e-05
)

Arguments

P

a numeric vector storing the first distribution.

Q

a numeric vector storing the second distribution.

method

a character string indicating the distance measure that should be computed.

p

power of the Minkowski distance (only used when method = "minkowski").

testNA

a logical value indicating whether distributions shall be checked for NA values.

unit

type of log function. Options are

  • unit = "log"

  • unit = "log2"

  • unit = "log10"
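
The unit only changes the base of the logarithm used inside log-based measures, so results differ by a constant factor. A hedged base-R sketch (the kl helper below is illustrative, not the package implementation):

```r
# Hedged sketch (base R only): for log-based measures such as the
# Kullback-Leibler divergence, switching the unit rescales the result
# by a constant factor.
P <- 1:10 / sum(1:10)
Q <- 20:29 / sum(20:29)
kl <- function(P, Q, log_fn) sum(P * log_fn(P / Q))
kl_log   <- kl(P, Q, log)     # unit = "log"   (natural logarithm)
kl_log2  <- kl(P, Q, log2)    # unit = "log2"
kl_log10 <- kl(P, Q, log10)   # unit = "log10"
# kl_log2 equals kl_log / log(2); kl_log10 equals kl_log / log(10)
```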

epsilon

a small value to address cases in the distance computation where division by zero occurs. In these cases, x / 0 or 0 / 0 will be replaced by epsilon. The default is epsilon = 0.00001. However, we recommend choosing a custom epsilon value depending on the size of the input vectors, the expected similarity between the compared probability density functions, and whether or not many 0 values are present within the compared vectors. As a rough rule of thumb, when dealing with very large input vectors that are very similar and contain many 0 values, the epsilon value should be set even smaller (e.g. epsilon = 0.000000001), whereas when vector sizes are small or distributions are very divergent, higher epsilon values may be appropriate (e.g. epsilon = 0.01). Addressing this epsilon issue is important to avoid cases where distance metrics return negative values, which are not defined and occur only due to the technical issue of computing x / 0 or 0 / 0.
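
A hedged base-R sketch of the replacement rule described above (the vectors and the element-wise substitution are illustrative, not the package's internal code):

```r
# Hedged sketch (base R): results of x / 0 (Inf) and 0 / 0 (NaN)
# divisions are replaced by epsilon, as the documentation describes.
P <- c(0.5, 0.4, 0.1, 0.0)
Q <- c(0.5, 0.0, 0.2, 0.0)
ratio <- P / Q                       # 1, Inf (x/0), 0.5, NaN (0/0)
epsilon <- 1e-05                     # the documented default
ratio[!is.finite(ratio)] <- epsilon  # epsilon stands in for Inf and NaN
ratio
```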

Value

A single distance value.

Examples

P <- 1:10 / sum(1:10)
Q <- 20:29 / sum(20:29)
dist_one_one(P, Q, method = "euclidean", testNA = FALSE)
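
As a sanity check, the Euclidean result can be reproduced directly in base R (this is the textbook formula, not the package's internal code):

```r
# Cross-check (base R): the Euclidean distance sqrt(sum((P - Q)^2))
# should match dist_one_one(P, Q, method = "euclidean").
P <- 1:10 / sum(1:10)
Q <- 20:29 / sum(20:29)
sqrt(sum((P - Q)^2))
```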

philentropy documentation built on Nov. 10, 2022, 6:18 p.m.