EntropyGMM    R Documentation
Compute an estimate of the (differential) entropy from a Gaussian Mixture Model (GMM) fitted using the mclust package.
EntropyGMM(object, ...)
## S3 method for class 'densityMclust'
EntropyGMM(object, ...)
## S3 method for class 'densityMclustBounded'
EntropyGMM(object, ...)
## S3 method for class 'Mclust'
EntropyGMM(object, ...)
## S3 method for class 'data.frame'
EntropyGMM(object, ...)
## S3 method for class 'matrix'
EntropyGMM(object, ...)
EntropyGauss(sigma)
nats2bits(x)
bits2nats(x)
object
    An object of class 'densityMclust', 'densityMclustBounded', or 'Mclust'
    obtained by fitting a GMM. If a data.frame or matrix of data values is
    provided, a GMM is first fitted to the data and the entropy is then
    computed.

...
    Further arguments passed to or from other methods.

sigma
    A symmetric covariance matrix.

x
    A vector of values.
For more details see vignette("mclustAddons").
EntropyGMM() returns an estimate of the entropy based on an estimated Gaussian mixture model (GMM) fitted using the mclust package. If a matrix or data frame of data values is provided, a GMM is first fitted to the data and the entropy is then computed.
EntropyGauss() returns the entropy for a multivariate Gaussian distribution with covariance matrix sigma.
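
For a d-dimensional Gaussian with covariance matrix Sigma, the differential entropy has the closed form d/2 * log(2*pi*e) + 1/2 * log(det(Sigma)) (in nats, assuming natural logarithms). A minimal sketch of how EntropyGauss() can be checked against this formula, assuming the mclustAddons package is loaded:

sigma = diag(2)                                   # 2-dimensional identity covariance
EntropyGauss(sigma)                               # entropy returned by the function
2/2 * log(2*pi*exp(1)) + 1/2 * log(det(sigma))    # closed-form value for comparison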
nats2bits() and bits2nats() convert input values from nats to bits, and vice versa. Information-theoretic quantities have different units depending on the base of the logarithm used: nats correspond to natural (base-e) logarithms, whereas bits correspond to base-2 logarithms.
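
Since 1 bit equals log(2) nats, these conversions amount to multiplying or dividing by log(2). A small illustrative sketch of the expected behaviour:

nats2bits(log(2))   # one bit, expressed in nats, converts back to 1 bit
bits2nats(1)        # and the reverse: 1 bit = log(2) nats (about 0.693)
nats2bits(1)        # 1 nat is about 1.443 bits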
Luca Scrucca
Robin S. and Scrucca L. (2023) Mixture-based estimation of entropy. Computational Statistics & Data Analysis, 177, 107582. doi:10.1016/j.csda.2022.107582
mclust::Mclust(), mclust::densityMclust().
library(mclustAddons)   # also loads mclust, which provides densityMclust()

X = iris[,1:4]
mod = densityMclust(X, plot = FALSE)   # fit a GMM density estimate to the iris data
h = EntropyGMM(mod)                    # entropy estimate from the fitted model
h
bits2nats(h)                           # nats2bits()/bits2nats() convert between the two units
EntropyGMM(X)                          # fit a GMM to the data internally, then compute the entropy
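
A further illustrative sketch, not part of the original examples: for data simulated from a single multivariate Gaussian (here drawn with MASS::mvrnorm(), assumed available), the GMM-based estimate should be close to the exact value returned by EntropyGauss().

set.seed(1)
sigma = matrix(c(1, 0.5, 0.5, 1), nrow = 2)                # true covariance of the simulated data
Z = MASS::mvrnorm(n = 1000, mu = c(0, 0), Sigma = sigma)   # simulate Gaussian data
EntropyGMM(Z)         # estimate from a GMM fitted to the simulated data
EntropyGauss(sigma)   # exact entropy of the generating Gaussian distribution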