RMoE is a package containing regularized Mixture-of-Experts models using the Lasso penalty.

RMoE contains the following Regularized Mixture-of-Experts models:
GaussianRMoE: Gaussian Regularized Mixture of Experts;
LogisticRMoE: Logistic Regularized Mixture of Experts;
PoissonRMoE: Poisson Regularized Mixture of Experts.
To learn more about RMoE, start with the vignettes:
browseVignettes(package = "RMoE")
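A minimal R snippet for getting started, assuming the package is available on CRAN (only base-R functions are used here; no RMoE-specific functions are shown, since their signatures are documented in the package's own help pages and vignettes):

```r
# Install the package (assuming it is published on CRAN) and load it
install.packages("RMoE")
library(RMoE)

# Browse the package vignettes for worked examples of the
# Gaussian, Logistic, and Poisson regularized MoE models
browseVignettes(package = "RMoE")
```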
Maintainer: Bao-Tuyen Huynh (baotuyen.dlu@gmail.com)

Authors:
Faicel Chamroukhi (faicel.chamroukhi@unicaen.fr, ORCID: 0000-0002-5894-3103)
Huynh, B. T., and Chamroukhi, F. (2019). Estimation and Feature Selection in Mixtures of Generalized Linear Experts Models. arXiv:1907.06994. https://arxiv.org/abs/1907.06994