LRMoE-package: LRMoE: Logit-Weighted Reduced Mixture-of-Experts


LRMoE: Logit-Weighted Reduced Mixture-of-Experts

Description

The Logit-Weighted Reduced Mixture-of-Experts (LRMoE) proposed by Fung et al. (2019) is a flexible framework for actuarial loss modelling. For more details, see Fung et al. (2019) "A class of mixture of experts models for general insurance: Theoretical developments" published in Insurance: Mathematics and Economics, and Fung et al. (2019) "A class of mixture of experts models for general insurance: Application to correlated claim frequencies" published in ASTIN Bulletin: The Journal of the IAA.
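To illustrate the idea behind the framework (this is a minimal sketch, not the package API), the LRMoE density of a response y given covariates x is a mixture of expert densities whose weights are multinomial-logit (softmax) functions of x. All names in the snippet below (lrmoe_density, alpha, expert_densities) are hypothetical and chosen for illustration only.

    # Sketch of the LRMoE density h(y | x) = sum_j pi_j(x) * f_j(y),
    # where pi_j(x) = exp(alpha_j' x) / sum_k exp(alpha_k' x).
    lrmoe_density <- function(y, x, alpha, expert_densities) {
      # alpha: g x p matrix of logit (gating) coefficients, one row per component
      # expert_densities: list of g functions, each returning the expert density f_j(y)
      scores <- as.vector(alpha %*% x)          # linear scores alpha_j' x
      weights <- exp(scores - max(scores))      # numerically stable softmax
      weights <- weights / sum(weights)
      expert_vals <- vapply(expert_densities, function(f) f(y), numeric(1))
      sum(weights * expert_vals)                # mixture density at y
    }

    # Example: two experts (gamma and lognormal) with covariate-dependent weights;
    # the second row of alpha is fixed at zero for identifiability.
    alpha <- rbind(c(0.5, -1.0), c(0.0, 0.0))
    experts <- list(function(y) dgamma(y, shape = 2, rate = 0.1),
                    function(y) dlnorm(y, meanlog = 3, sdlog = 1))
    lrmoe_density(y = 25, x = c(1, 0.3), alpha = alpha, expert_densities = experts)

The gating structure is what distinguishes LRMoE from an ordinary mixture: the component weights vary with the policyholder's covariates, while the expert distributions capture the shape of the loss severity or frequency within each component.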

Author(s)

Maintainer: Spark Tseung <spark.tseung@mail.utoronto.ca>

Authors:

  • Tsz Chai Fung

  • Andrei L. Badescu

  • Sheldon X. Lin

  • Siyi Wei

