matrixmixture (R Documentation)
Description

Clustering by fitting a mixture model using EM with K groups and
unconstrained covariance matrices for a matrix variate normal or matrix
variate t distribution (with specified degrees of freedom nu).
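For orientation, the objective the EM iterations increase is the observed-data log-likelihood of a K-component finite mixture (the standard formulation; the symbols pi_g, M_g, U_g, V_g below are illustrative notation and are not defined on this page):

% log-likelihood of a K-component matrix variate mixture:
% pi_g are the mixing proportions and f(.) the matrix variate normal or t
% density with mean M_g, row covariance U_g, column covariance V_g
% (and degrees of freedom nu_g for the t model).
\ell(\vartheta) = \sum_{i=1}^{n} \log \left( \sum_{g=1}^{K} \pi_g \, f\left(X_i \mid M_g, U_g, V_g, \nu_g\right) \right)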
Usage

matrixmixture(
  x,
  init = NULL,
  prior = NULL,
  K = length(prior),
  iter = 1000,
  model = "normal",
  method = NULL,
  row.mean = FALSE,
  col.mean = FALSE,
  tolerance = 0.1,
  nu = NULL,
  ...,
  verbose = 0,
  miniter = 5,
  convergence = TRUE
)
Arguments

x: data; a list of numeric matrices or a three-dimensional array of
   matrix observations (p x q x n, as in the example below).

init: a list containing an array of K group mean matrices (centers) and,
   optionally, arrays of between-row (U) and between-column (V) covariance
   matrices. If not supplied, an initialization is generated; see
   init_matrixmixture().

prior: prior probabilities for the K classes; a vector that sums to one.

K: number of classes - provide either this or the prior. If this is
   provided, the prior will be a uniform distribution among the classes
   (see the sketch after this list).

iter: maximum number of iterations.

model: whether to use the "normal" or "t" distribution.

method: what method to use to fit the distribution. Currently no options.

row.mean: FALSE by default. If TRUE, fits a common mean within each row.
   If both this and col.mean are TRUE, a single common mean is fit for the
   entire matrix.

col.mean: FALSE by default. If TRUE, fits a common mean within each
   column. If both this and row.mean are TRUE, a single common mean is fit
   for the entire matrix.

tolerance: convergence criterion, using Aitken acceleration of the
   log-likelihood by default.

nu: degrees of freedom parameter for the t distribution; can be a vector
   of length K.

...: pass additional arguments to the underlying fitting functions.

verbose: whether to print diagnostic output; 0 (no output) by default.

miniter: minimum number of iterations.

convergence: TRUE by default, in which case Aitken acceleration of the
   log-likelihood is used to determine convergence; if FALSE, convergence
   is instead declared when the change in log-likelihood falls below
   tolerance.
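As a brief illustration of the K/prior relationship described above (dat is a hypothetical placeholder name for a p x q x n data array; only arguments from the usage block are used), the following two calls are intended to be equivalent:

# `dat` is a placeholder for a p x q x n array of matrix observations
fit_k     <- matrixmixture(dat, K = 2)                # uniform prior over 2 classes
fit_prior <- matrixmixture(dat, prior = c(0.5, 0.5))  # the same prior, given explicitly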
Value

A list of class MixMatrixModel containing the following components:

prior: the prior probabilities used.

init: the initialization used.

K: the number of groups.

N: the number of observations.

centers: the group means.

U: the between-row covariance matrices.

V: the between-column covariance matrices.

posterior: the posterior probabilities for each observation.

pi: the final mixing proportions.

nu: the degrees of freedom parameter if the t distribution was used.

convergence: whether the model converged.

logLik: a vector of the log-likelihood of each iteration, ending in the
   final log-likelihood of the model.

model: the model used.

method: the method used.

call: the (matched) function call.
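For example, hard cluster assignments can be read off the posterior component. This is a minimal sketch assuming res is a fitted MixMatrixModel (as in the Examples below) and that res$posterior has one row per observation and one column per group:

groups <- apply(res$posterior, 1, which.max)  # most probable class per observation
table(groups)                                 # resulting cluster sizes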
References

Andrews, Jeffrey L., Paul D. McNicholas, and Sanjeena Subedi. 2011.
"Model-Based Classification via Mixtures of Multivariate t-Distributions."
Computational Statistics & Data Analysis 55 (1): 520-29.
\doi{10.1016/j.csda.2010.05.019}

Fraley, Chris, and Adrian E. Raftery. 2002. "Model-Based Clustering,
Discriminant Analysis, and Density Estimation." Journal of the American
Statistical Association 97 (458): 611-31. \doi{10.1198/016214502760047131}

McLachlan, Geoffrey J., Sharon X. Lee, and Suren I. Rathnayake. 2019.
"Finite Mixture Models." Annual Review of Statistics and Its Application
6: 355-78. \doi{10.1146/annurev-statistics-031017-100325}

Viroli, Cinzia. 2011. "Finite Mixtures of Matrix Normal Distributions for
Classifying Three-Way Data." Statistics and Computing 21 (4): 511-22.
\doi{10.1007/s11222-010-9188-x}
See Also

init_matrixmixture()
Examples

set.seed(20180221)
A <- rmatrixt(20, mean = matrix(0, nrow = 3, ncol = 4), df = 5)
# 3x4 matrices with mean 0
B <- rmatrixt(20, mean = matrix(1, nrow = 3, ncol = 4), df = 5)
# 3x4 matrices with mean 1
C <- array(c(A, B), dim = c(3, 4, 40)) # combine into one array
prior <- c(.5, .5) # equal probability prior
# create an initialization object that starts at the true parameters
init <- list(centers = array(c(rep(0, 12), rep(1, 12)), dim = c(3, 4, 2)),
             U = array(c(diag(3), diag(3)), dim = c(3, 3, 2)) * 20,
             V = array(c(diag(4), diag(4)), dim = c(4, 4, 2)))
# fit the model
res <- matrixmixture(C, init = init, prior = prior, nu = 5,
                     model = "t", tolerance = 1e-3, convergence = FALSE)
print(res$centers) # the final centers
print(res$pi)      # the final mixing proportions
plot(res)          # the log-likelihood by iteration
logLik(res)        # log-likelihood of the final result
BIC(res)           # BIC of the final result
predict(res, newdata = C[, , c(1, 21)]) # predicted class membership
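As a further sketch (reusing the simulated array C above and only arguments shown in the usage block), BIC can be used to compare fits with different numbers of groups; this assumes each fit converges with the default initialization:

# compare fits with 2 and 3 groups; a smaller BIC indicates the preferred model
fits <- lapply(2:3, function(k)
  matrixmixture(C, K = k, model = "t", nu = 5, tolerance = 1e-3))
sapply(fits, BIC)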