Description
AIC.moc generates a table of log-likelihood, AIC, BIC, ICL-BIC and entropy values, along with the degrees of freedom, for multiple moc objects.

logLik returns an object of class logLik containing the log-likelihood, degrees of freedom and number of observations.
loglike.moc computes the log-likelihood of a moc object evaluated at supplied parameter values, unlike logLik above, which uses the estimated parameter values. It gives the option to re-evaluate the model, in which case the supplied parameter values are used as new starting values.
entropy is a generic method to compute the entropy of sets of probabilities.

The entropy of a set of k probabilities (p_1, ..., p_k) is computed as entropy = -Sum_i p_i * log(p_i). It reaches its minimum of 0 when some p_i = 1 (minimum uncertainty) and its maximum of log(k) when all probabilities are equal, p_i = 1/k (maximum uncertainty). Standardized entropy is entropy / log(k), which lies in the interval [0, 1]. The total and mean mixture entropy are the weighted sum and mean of the mixture probability entropies over all subjects. These are computed for both the prior mixture probabilities (without knowledge of the response patterns) and the posterior mixture probabilities (with knowledge of the responses).
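As an illustration of these formulas, here is a minimal base R sketch (illustrative only; the function names are invented and this is not the package's entropy method):

```r
## Minimal base R sketch of the entropy formulas above
## (hypothetical helpers, not the moc package's entropy() method).
entropy_vec <- function(p) {
  p <- p[p > 0]                 # drop zeros: 0 * log(0) is taken as 0
  -sum(p * log(p))
}
std_entropy <- function(p) entropy_vec(p) / log(length(p))

entropy_vec(c(1, 0, 0))         # 0: one p_i = 1, minimum uncertainty
entropy_vec(rep(1/4, 4))        # log(4): maximum uncertainty for k = 4
std_entropy(rep(1/4, 4))        # 1: standardized entropy lies in [0, 1]
```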
The default method entropy.default computes the entropy and standardized entropy of a set of probabilities.

entropy.moc generates a table containing the weighted total and mean standardized entropy of the prior and posterior mixture probabilities of moc models.
Arguments
object, ...: Objects of class moc.

k: Can be any real number or the string "BIC".

parm: Parameter values at which the log-likelihood is evaluated.

evaluate: Boolean indicating whether re-evaluation of the model is desired. If TRUE, the model is re-evaluated using the supplied parameter values as new starting values.
Details

The value computed by AIC.moc is -2*log-likelihood + k*npar. Specific treatment is carried out for BIC (k = log(nsubject*nvar)), AIC (k = 2) and the log-likelihood (k = 0). Setting k = "BIC" produces a table with the BIC, the mixture posterior entropy = -Sum_{i,k} wt[i] * post[i,k] * log(post[i,k]), which is an indicator of mixture separation, the df, and ICL-BIC = BIC + 2 * entropy, which is an entropy-corrected BIC; see McLachlan and Peel (2000) and Biernacki et al. (2000).
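For concreteness, the relations above can be sketched in base R with hypothetical values (ll, npar, wt and post are invented for illustration; this is not the package's code):

```r
## Hedged sketch of the criteria in this section; all values are hypothetical.
ll   <- -1234.5                       # log-likelihood of a fitted model
npar <- 8                             # number of estimated parameters
n    <- 200 * 3                       # nsubject * nvar

aic <- -2 * ll + 2 * npar             # AIC: k = 2
bic <- -2 * ll + log(n) * npar        # BIC: k = log(nsubject * nvar)

## Mixture posterior entropy from weights wt and posterior probabilities post
wt   <- rep(1, 200)                   # subject weights (hypothetical)
post <- matrix(runif(200 * 2), 200, 2)
post <- post / rowSums(post)          # each row sums to 1
ent  <- -sum(wt * post * log(post))   # -Sum_{i,k} wt[i]*post[i,k]*log(post[i,k])

icl_bic <- bic + 2 * ent              # entropy-corrected BIC
```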
Value

AIC.moc returns a data frame with the relevant information for one or more moc objects.

The likelihood methods work on a single moc object: logLik.moc returns an object of class logLik with attributes df, nobs and moc.name, while loglike.moc returns a matrix containing the log-likelihood and the corresponding estimated parameters, with attributes moc.name and parameters.

entropy.moc returns a data.frame with the number of groups and the total and mean standardized prior and posterior entropy of multiple moc objects. The percentage of reduction from prior to posterior entropy within each model is also supplied.
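A hedged usage sketch of the methods above (m1 and m2 are hypothetical fitted moc objects, so the calls are shown not run):

```r
## Not run: requires the moc package and fitted models m1, m2 (hypothetical).
# AIC(m1, m2, k = "BIC")   # table with BIC, posterior entropy, df and ICL-BIC
# logLik(m1)               # logLik object with df and nobs attributes
# entropy(m1, m2)          # prior/posterior standardized entropy table
```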
Note

Be aware that degrees of freedom (df) for mixture models are usually useless (if not meaningless), and the likelihood ratio of apparently nested models often does not converge to a chi-square with the corresponding df.
Author(s)

Bernard Boulerice <bernard.boulerice.bb@gmail.com>
References

McLachlan, G. and Peel, D. (2000) Finite Mixture Models, Wiley-Interscience, New York.

Biernacki, C., Celeux, G. and Govaert, G. (2000) Assessing a mixture model for clustering with the integrated completed likelihood, IEEE Transactions on Pattern Analysis and Machine Intelligence, 22, pp. 719–725.
See Also

moc, confint.moc, profiles.postCI, entropyplot.moc, npmle.gradient