Estimate parameters in a mixture of latent variable models via the EM algorithm.
x: List of lvm objects
data: Data set (data.frame)
k: Number of mixture components
control: Optimization parameters (see details)
FUN: See details below
type: Type of EM algorithm (standard, classification, stochastic)
...: Additional arguments passed to lower-level functions
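The three choices of type differ only in the E-step. A minimal sketch of that difference (in Python with hypothetical names, purely illustrative — not lava's implementation): standard EM keeps the soft responsibilities, classification EM (CEM) hard-assigns each observation to its most likely component, and stochastic EM (SEM) draws a random assignment from the responsibilities.

```python
# Illustrative sketch, not lava's code: the E-step of a two-component
# univariate Gaussian mixture under the three EM variants.
import numpy as np

def e_step(x, w, mu, sd, type="standard", rng=None):
    """Return per-observation component weights to be used in the M-step.

    standard:        soft responsibilities (classic EM)
    classification:  hard assignment to the most likely component (CEM)
    stochastic:      assignment drawn at random from the responsibilities (SEM)
    """
    # Responsibilities: P(component j | x_i), proportional to w_j * N(x_i; mu_j, sd_j)
    dens = np.stack([w[j] * np.exp(-0.5 * ((x - mu[j]) / sd[j]) ** 2) / sd[j]
                     for j in range(len(w))], axis=1)
    r = dens / dens.sum(axis=1, keepdims=True)
    if type == "classification":
        hard = np.zeros_like(r)
        hard[np.arange(len(x)), r.argmax(axis=1)] = 1.0
        return hard
    if type == "stochastic":
        rng = rng or np.random.default_rng(0)
        draws = np.array([rng.choice(len(w), p=ri) for ri in r])
        hard = np.zeros_like(r)
        hard[np.arange(len(x)), draws] = 1.0
        return hard
    return r  # standard EM: soft weights

x = np.array([-2.0, -1.9, 2.0, 2.1])
r = e_step(x, w=[0.5, 0.5], mu=[-2.0, 2.0], sd=[1.0, 1.0])
```

With well-separated components as above, the soft responsibilities are already close to 0/1, so the three variants nearly coincide; they differ most with overlapping components.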
The performance of the EM algorithm can be tuned via the control
argument, a list where a subset of the following members can be altered:

- Optional starting values.
- nstart: Evaluate nstart different starting values and run the
  EM algorithm from the set of parameters with the largest likelihood.
- tol: Convergence tolerance of the EM algorithm. The algorithm is
  stopped when the absolute change in likelihood and in the parameters
  (2-norm) between successive iterations is less than tol.
- Maximum number of iterations of the EM algorithm.
- Scale-down (i.e. a number between 0 and 1) of the step size of the
  Newton-Raphson algorithm in the M-step.
- trace: Trace information on the EM algorithm is printed on every
  trace-th iteration.
Note that the algorithm can be aborted at any time (Ctrl-C) and the current estimates will still be saved (via an on.exit call).
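The stopping rule described above can be sketched as follows (an assumption based on the description, with hypothetical names — not lava's code): iteration stops only when both the absolute change in log-likelihood and the 2-norm of the parameter change drop below tol.

```python
# Sketch of the stated convergence check: both the likelihood change and
# the parameter change (2-norm) must fall below tol.
import numpy as np

def converged(loglik_old, loglik_new, theta_old, theta_new, tol=1e-4):
    d_theta = np.linalg.norm(np.asarray(theta_new) - np.asarray(theta_old))
    return abs(loglik_new - loglik_old) < tol and d_theta < tol
```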
A mixture object
Author: Klaus K. Holst
set.seed(1)
m0 <- lvm(list(y~x+z,x~z))
distribution(m0,~z) <- binomial.lvm()
d <- sim(m0,500,p=c("y<-z"=2,"y<-x"=1))
## unmeasured confounder example
m <- baptize(lvm(y~x))
covariance(m,~x) <- "v"
intercept(m,~x+y) <- NA
M <- mixture(m,k=2,data=d,control=list(trace=1,tol=1e-4))
summary(M)
lm(y~x,d)
## True slope := 1
