Estimate mixture latent variable model
Description
Estimate parameters in a mixture of latent variable models via the EM algorithm.
Usage
mixture(x, data, k, control, FUN, type, ...)
Arguments
x 
List of lvm objects. If only a single lvm object is given, a k-mixture of this model is estimated (with free parameters varying between the mixture components). 
data 
data.frame 
k 
Number of mixture components 
control 
Optimization parameters (see details) 
FUN 
See details below 
type 
Type of EM algorithm (standard, classification, stochastic) 
... 
Additional arguments passed to lower-level functions 
Details
The performance of the EM algorithm can be tuned via the control
argument, a list where a subset of the following members can be altered:
 start
Optional starting values
 nstart
Evaluate nstart different starting values and run the EM algorithm on the parameters with the largest likelihood
 tol
Convergence tolerance of the EM algorithm. The algorithm is stopped when the absolute change in likelihood and parameters (2-norm) between successive iterations is less than tol
 iter.max
Maximum number of iterations of the EM algorithm
 gamma
Scale-down (i.e., a number between 0 and 1) of the step-size of the Newton-Raphson algorithm in the M-step
 trace
Trace information on the EM algorithm is printed on every trace-th iteration
Note that the algorithm can be aborted at any time (Ctrl-C), and the estimates obtained so far will still be saved (via an on.exit call).
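To make the stopping rule governed by tol and iter.max concrete, the following is a minimal self-contained sketch of EM for a two-component univariate Gaussian mixture in plain base R. It is an illustration only, not the lava implementation (which handles general latent variable models and uses Newton-Raphson in the M-step); the function name em2 and its defaults are hypothetical.

```r
## Minimal EM sketch for a 2-component univariate Gaussian mixture.
## Illustrates the 'tol'/'iter.max' stopping rule described above;
## NOT the lava implementation, just base R.
em2 <- function(y, tol = 1e-4, iter.max = 100) {
  ## Crude starting values
  p <- 0.5
  mu <- quantile(y, c(0.25, 0.75), names = FALSE)
  s <- rep(sd(y), 2)
  ll.old <- -Inf
  par.old <- c(p, mu, s)
  for (i in seq_len(iter.max)) {
    ## E-step: posterior component membership probabilities
    d1 <- p * dnorm(y, mu[1], s[1])
    d2 <- (1 - p) * dnorm(y, mu[2], s[2])
    w <- d1 / (d1 + d2)
    ## Log-likelihood at the current parameters
    ll <- sum(log(d1 + d2))
    ## M-step: weighted updates (closed form in this simple case)
    p <- mean(w)
    mu <- c(weighted.mean(y, w), weighted.mean(y, 1 - w))
    s <- sqrt(c(sum(w * (y - mu[1])^2) / sum(w),
                sum((1 - w) * (y - mu[2])^2) / sum(1 - w)))
    par <- c(p, mu, s)
    ## Stop when both the change in likelihood and the change in
    ## parameters (2-norm) fall below 'tol'
    if (abs(ll - ll.old) < tol &&
        sqrt(sum((par - par.old)^2)) < tol) break
    ll.old <- ll
    par.old <- par
  }
  list(prob = p, mean = mu, sd = s, logLik = ll, iterations = i)
}

set.seed(1)
y <- c(rnorm(200, -2), rnorm(200, 2))
em2(y)
```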
Value
A mixture object
Author(s)
Klaus K. Holst
See Also
mvnmix
Examples
set.seed(1)
m0 <- lvm(list(y~x+z, x~z))
distribution(m0, ~z) <- binomial.lvm()
d <- sim(m0, 500, p=c("y<-z"=2, "y<-x"=1))

## Unmeasured confounder example
m <- baptize(lvm(y~x))
covariance(m, ~x) <- "v"
intercept(m, ~x+y) <- NA
M <- mixture(m, k=2, data=d, control=list(trace=1, tol=1e-4))
summary(M)
lm(y~x, d)
## True slope := 1
