Expectation-maximization (EM) algorithm for estimating a two-component Gaussian mixture in which all controls are constrained to one component and the cases follow a mixture of the two components (the two-component constrained model). This is an internal method called from `bc.twocomp`.


```
em.twocomp.m1(x.all, case.indicator, max.iters = 1000, errtol = 1e-09,
              control.comp = 1, start.vals = NULL)
```
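As a sketch of how the inputs might be assembled, the snippet below simulates controls from a single Gaussian and cases from a two-component mixture, then builds `x.all` and `case.indicator` in the form the usage above expects. The simulated means and sample sizes are illustrative assumptions, not values from the package.

```r
set.seed(42)
# Controls: a single Gaussian component.
controls <- rnorm(150, mean = 0, sd = 1)
# Cases: a mixture of the control component and a shifted component.
cases <- c(rnorm(75, mean = 0, sd = 1), rnorm(75, mean = 3, sd = 1))

# Combined data vector and matching 0/1 case indicator.
x.all <- c(controls, cases)
case.indicator <- c(rep(0, length(controls)), rep(1, length(cases)))

# With the package loaded, one would then call:
# fit <- em.twocomp.m1(x.all, case.indicator, control.comp = 1)
```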

| Argument | Description |
| --- | --- |
| `x.all` | vector of cases and controls |
| `case.indicator` | a vector of the same length as `x.all`, with 1s in the case positions and 0s in the control positions |
| `max.iters` | the maximum number of iterations to run |
| `errtol` | error tolerance; convergence is declared when the change in the maximum log-likelihood falls below this value |
| `control.comp` | indicator of which component contains the controls (1 or 2) |
| `start.vals` | starting values for the EM algorithm |

| Value | Description |
| --- | --- |
| `max.loglike` | the maximum log-likelihood value for the algorithm |
| `mu` | estimated means for each component |
| `sig` | estimated standard deviations for each component |
| `pi` | estimated proportion of cases in each component |
| `n.iters` | the number of iterations the algorithm took to converge |
| `control.comp` | indicator of which component contains the controls (1 or 2) |
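To make the constrained model concrete, here is a minimal base-R sketch of such an EM fit, assuming the controls are fixed in component 1: the E-step computes, for cases only, the posterior probability of belonging to component 2, and the M-step re-estimates `mu`, `sig`, and `pi` with controls contributing to component 1 with weight 1. This is an illustrative reimplementation, not the package's `em.twocomp.m1`, and the function name is hypothetical.

```r
# Sketch of a constrained two-component EM: controls fixed in component 1,
# cases modeled as a mixture of components 1 and 2.
em_twocomp_sketch <- function(x.all, case.indicator,
                              max.iters = 1000, errtol = 1e-9) {
  cases    <- x.all[case.indicator == 1]
  controls <- x.all[case.indicator == 0]

  # Crude starting values: controls seed component 1, cases seed component 2.
  mu  <- c(mean(controls), mean(cases))
  sig <- c(sd(controls), sd(cases))
  pi2 <- 0.5
  old.ll <- -Inf

  for (iter in 1:max.iters) {
    # E-step: posterior probability that each case lies in component 2.
    d1 <- dnorm(cases, mu[1], sig[1])
    d2 <- dnorm(cases, mu[2], sig[2])
    gamma <- pi2 * d2 / ((1 - pi2) * d1 + pi2 * d2)

    # M-step: every control contributes to component 1 with weight 1;
    # cases contribute with weight (1 - gamma) to component 1.
    w1 <- c(rep(1, length(controls)), 1 - gamma)
    x1 <- c(controls, cases)
    mu[1]  <- sum(w1 * x1) / sum(w1)
    sig[1] <- sqrt(sum(w1 * (x1 - mu[1])^2) / sum(w1))
    mu[2]  <- sum(gamma * cases) / sum(gamma)
    sig[2] <- sqrt(sum(gamma * (cases - mu[2])^2) / sum(gamma))
    pi2    <- mean(gamma)

    # Observed-data log-likelihood; stop when its change falls below errtol.
    ll <- sum(dnorm(controls, mu[1], sig[1], log = TRUE)) +
          sum(log((1 - pi2) * dnorm(cases, mu[1], sig[1]) +
                  pi2 * dnorm(cases, mu[2], sig[2])))
    if (abs(ll - old.ll) < errtol) break
    old.ll <- ll
  }
  list(max.loglike = ll, mu = mu, sig = sig,
       pi = c(1 - pi2, pi2), n.iters = iter, control.comp = 1)
}
```

The return list mirrors the documented values above; the actual package routine may differ in its starting values, convergence checks, and handling of `control.comp = 2`.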

Michelle Winerip, Garrick Wallstrom, Joshua LaBaer

Dempster, A. P., Laird, N. M., and Rubin, D. B. (1977). "Maximum likelihood from incomplete data via the EM algorithm." Journal of the Royal Statistical Society, Series B (Methodological), 39(1), 1-38.

`bc.binorm`

`bc.twocomp`

`bc.fourcomp`

`em.twocomp.m2`

`em.twocomp.m3`
