Gmm: Gaussian Mixture Model Prior description class

View source: R/model_gmm.R


Description

An S4 class representing a multivariate Gaussian mixture model. It corresponds to the following generative process:

\pi \sim \mathrm{Dirichlet}(\alpha)

Z_i \sim \mathcal{M}(1, \pi)

V_k \sim \mathcal{W}(\varepsilon^{-1}, n_0)

\mu_k \sim \mathcal{N}(\mu, (\tau V_k)^{-1})

X_i \mid Z_{ik} = 1 \sim \mathcal{N}(\mu_k, V_k^{-1})

where \mathcal{W}(\varepsilon^{-1}, n_0) denotes the Wishart distribution. Use the Gmm-class when fitting a simple Gaussian mixture model, and the GmmPrior-class when fitting a CombinedModels-class.
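A hedged sketch of the two use cases (it assumes the greed package is attached; the named-list pattern passed to CombinedModels() mirrors the CombinedModels-class documentation and is illustrative, not normative):

library(greed)
X <- as.matrix(iris[, 1:4])

# Simple Gaussian mixture: pass Gmm() directly to greed()
fit <- greed(X, model = Gmm())

# Combined model: GmmPrior() describes one block of a CombinedModels()
# specification; the data list and the models list are assumed to share names
fit_combined <- greed(
  list(cont = X),
  model = CombinedModels(models = list(cont = GmmPrior()))
)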

Usage

GmmPrior(tau = 0.01, N0 = NaN, mu = NaN, epsilon = NaN)

Gmm(tau = 0.01, N0 = NaN, mu = NaN, epsilon = NaN, alpha = 1)

Arguments

tau

Prior parameter (inverse variance); default 0.01.

N0

Prior parameter (pseudo count); should be greater than the number of features. Defaults to NaN, in which case it is estimated from the data as the number of columns of X.

mu

Prior parameter for the means (vector of size D). Defaults to NaN, in which case mu is estimated from the data as the mean of X.

epsilon

Prior parameter for the covariance matrices (matrix of size D x D). Defaults to a matrix of NaN, in which case epsilon is estimated from the data as 0.1 times a diagonal matrix with the variances of the columns of X.

alpha

Dirichlet prior parameter on the cluster proportions (default 1).
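For illustration, the data-driven defaults described above can be written out explicitly (a sketch; iris stands in for any numeric data matrix X with D columns):

X <- as.matrix(iris[, 1:4])
D <- ncol(X)
model <- Gmm(
  tau = 0.01,                                    # inverse variance
  N0 = D,                                        # default: the number of columns of X
  mu = colMeans(X),                              # default: the column means of X
  epsilon = 0.1 * diag(apply(X, 2, stats::var)), # default: 0.1 * diag(column variances)
  alpha = 1                                      # Dirichlet parameter
)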

Value

GmmPrior() returns a GmmPrior-class object.

Gmm() returns a Gmm-class object.

References

Bertoletti, Marco, Friel, Nial, and Rastelli, Riccardo (2015). "Choosing the number of clusters in a finite mixture model using an exact Integrated Completed Likelihood criterion." METRON 73. doi:10.1007/s40300-015-0064-5.

See Also

GmmFit-class, GmmPath-class

Other DlvmModels: CombinedModels, DcLbm, DcSbm, DiagGmm, DlvmPrior-class, Lca, MoM, MoR, MultSbm, Sbm, greed()

Examples

GmmPrior()
GmmPrior(tau = 0.1)
Gmm()
Gmm(tau = 0.1, alpha = 0.5)
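# A short end-to-end sketch (assumes greed() and the clustering() accessor
# exported by the package; iris is stand-in data)
fit <- greed(as.matrix(iris[, 1:4]), model = Gmm(tau = 0.1))
table(clustering(fit))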
