The Logit-weighted Reduced Mixture-of-Experts (LRMoE) proposed by Fung et al. (2019) is a flexible framework for actuarial loss modelling. For more details, see Fung et al. (2019) "A class of mixture of experts models for general insurance: Theoretical developments" published in Insurance: Mathematics and Economics, and Fung et al. (2019) "A class of mixture of experts models for general insurance: Application to correlated claim frequencies" in ASTIN Bulletin: The Journal of the IAA.
| Package details | |
|---|---|
| Maintainer | Spark Tseung <spark.tseung@mail.utoronto.ca> |
| License | GPL-3 + file LICENSE |
| Version | 0.2.0 |
| Package repository | View on GitHub |
Installation

Install the latest version of this package by entering the following in R:
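The installation snippet itself is missing from this page. A minimal sketch using the `remotes` package, assuming the package is hosted on GitHub under `sparktseung/LRMoE` (a repository path inferred from the maintainer's name, not stated on this page):

```r
# Install the remotes package if it is not already available
install.packages("remotes")

# Install LRMoE from GitHub.
# NOTE: the repository path "sparktseung/LRMoE" is an assumption
# inferred from the maintainer listed above, not confirmed here.
remotes::install_github("sparktseung/LRMoE")

# Load the package
library(LRMoE)
```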