RMoE-package: RMoE: LASSO Regularized Mixture of Experts Models


Description

RMoE is an R package providing Mixture-of-Experts models regularized with the Lasso penalty.

RMoE implements regularized Mixture-of-Experts models with generalized linear experts, following Huynh and Chamroukhi (2019); see the References section below.

To learn more about RMoE, start with the vignettes: browseVignettes(package = "RMoE")
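A minimal getting-started sketch follows. It assumes the package is installed from the fchamroukhi/HDME GitHub repository named in the project links; the devtools::install_github() call is an assumption and is not part of the package documentation itself.

    # Install the package from GitHub (assumed repository location; requires devtools)
    # install.packages("devtools")
    devtools::install_github("fchamroukhi/HDME")

    # Load the package, then browse its vignettes and help index
    library("RMoE")
    browseVignettes(package = "RMoE")
    help(package = "RMoE")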

Author(s)

Maintainer: Bao-Tuyen Huynh baotuyen.dlu@gmail.com

Authors:

Faicel Chamroukhi

References

Huynh B. T., Chamroukhi F. 2019. Estimation and Feature Selection in Mixtures of Generalized Linear Experts Models. https://arxiv.org/abs/1907.06994.

See Also

Useful links:

https://github.com/fchamroukhi/HDME
