entropy | R Documentation

Compute information-based estimates and distances.

```
entropy(.data, .base = 2, .norm = FALSE, .do.norm = NA, .laplace = 1e-12)
kl_div(.alpha, .beta, .base = 2, .do.norm = NA, .laplace = 1e-12)
js_div(.alpha, .beta, .base = 2, .do.norm = NA, .laplace = 1e-12, .norm.entropy = FALSE)
cross_entropy(.alpha, .beta, .base = 2, .do.norm = NA,
.laplace = 1e-12, .norm.entropy = FALSE)
```

`.data` |
Numeric vector. Any distribution. |

`.base` |
Numeric. The base of the logarithm. |

`.norm` |
Logical. If TRUE, normalises the entropy by its maximal possible value. |

`.do.norm` |
Logical. If TRUE, normalises the input distributions so that they sum to 1. |

`.laplace` |
Numeric. The value of the Laplace correction: a small pseudocount added to avoid zero probabilities. |

`.alpha` |
Numeric vector. The distribution of a random variable. |

`.beta` |
Numeric vector. The distribution of a random variable. |

`.norm.entropy` |
Logical. If TRUE, normalises the resulting value by the average entropy of the input distributions. |

A single numeric value: the estimated entropy or divergence.
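To make the parameters concrete, here is a minimal sketch of how Shannon entropy with a Laplace correction is conventionally computed. The function name `shannon` is hypothetical and this is not the package's actual implementation; it only illustrates the roles of `.base`, `.do.norm`, and `.laplace`.

```r
# Hypothetical sketch, not the package's actual code
shannon <- function(p, base = 2, laplace = 1e-12) {
  p <- p + laplace                 # Laplace correction: avoid log(0)
  p <- p / sum(p)                  # normalise so the values sum to 1
  -sum(p * log(p, base = base))    # Shannon entropy in the given base
}

# A uniform distribution over 4 outcomes has entropy log2(4) = 2 bits
shannon(c(1, 1, 1, 1))
```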

```
# Two random non-negative vectors used as unnormalised distributions
P <- abs(rnorm(10))
Q <- abs(rnorm(10))
entropy(P)          # Shannon entropy of P (in bits, since .base = 2)
kl_div(P, Q)        # Kullback-Leibler divergence
js_div(P, Q)        # Jensen-Shannon divergence
cross_entropy(P, Q) # cross-entropy
```
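For reference, the Jensen-Shannon divergence is defined via the Kullback-Leibler divergence against the mixture M = (P + Q) / 2. The sketch below shows these textbook definitions on already-normalised distributions; the package's exact normalisation and correction details may differ. The helper name `kl` is illustrative only.

```r
# Textbook definitions, assuming p and q are strictly positive and sum to 1
kl <- function(p, q, base = 2) sum(p * log(p / q, base = base))

p <- c(0.5, 0.5)
q <- c(0.9, 0.1)
m <- (p + q) / 2                       # mixture distribution
js <- 0.5 * kl(p, m) + 0.5 * kl(q, m)  # Jensen-Shannon divergence
js                                     # lies in [0, 1] when base = 2
```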
