sparktseung/LRMoE: Logit-Weighted Reduced Mixture-of-Experts

The Logit-Weighted Reduced Mixture-of-Experts (LRMoE) proposed by Fung et al. (2019) is a flexible framework for actuarial loss modelling. For more details, see Fung et al. (2019), "A class of mixture of experts models for general insurance: Theoretical developments", published in Insurance: Mathematics and Economics, and Fung et al. (2019), "A class of mixture of experts models for general insurance: Application to correlated claim frequencies", published in ASTIN Bulletin: The Journal of the IAA.
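
At a high level, an LRMoE model forms covariate-dependent mixture weights through a multinomial-logit (softmax) gating function and combines them with expert loss distributions. The short base-R sketch below illustrates that structure only; the function names (logit_gate, expert_density, lrmoe_density), the coefficient values, and the choice of two gamma experts are hypothetical and are not part of the package's API.

# Illustrative sketch of LRMoE-style logit gating (not the package API)
logit_gate <- function(x, alpha) {
  scores <- as.numeric(alpha %*% c(1, x))   # one linear score per expert (intercept + covariates)
  w <- exp(scores - max(scores))            # softmax with numerical stabilisation
  w / sum(w)
}

# Hypothetical gating coefficients (rows = experts; columns = intercept + 2 covariates)
alpha <- rbind(c( 0.5,  1.0, -0.3),
               c(-0.5, -1.0,  0.3))

# Hypothetical gamma experts for a positive loss amount y
expert_density <- function(y) {
  c(dgamma(y, shape = 2, rate = 0.01),
    dgamma(y, shape = 5, rate = 0.05))
}

# Mixture density of a loss y given covariates x
lrmoe_density <- function(y, x) {
  sum(logit_gate(x, alpha) * expert_density(y))
}

lrmoe_density(y = 150, x = c(0.2, 1.0))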

Getting started

Package details

Maintainer: Spark Tseung <spark.tseung@mail.utoronto.ca>
License: GPL-3 + file LICENSE
Version: 0.2.0
Package repository: View on GitHub
Installation

Install the latest version of this package by entering the following in R:
install.packages("remotes")
remotes::install_github("sparktseung/LRMoE")
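
Once installed, the package can be loaded in an R session as usual:
library(LRMoE)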