
View source: R/auxilaryfunctions.R

calc.marglogL    R Documentation

Calculate the marginal log-likelihood for a GLMM fitted using rpql

Description

After fitting and performing joint (fixed and random effects) selection using regularized PQL, one may then (for one reason or another) want to calculate the marginal log-likelihood for the (sub)model, possibly on a test dataset for prediction. This is the main purpose of calc.marglogL.

Usage

calc.marglogL(new.data, fit, B = 1000)
  

Arguments

new.data

A list containing the elements new.data$y, new.data$X, and new.data$Z. These correspond respectively to the responses, the fixed effects model matrix, and the list of random effects model matrices on which the marginal log-likelihood is to be calculated. No check is made against the elements in fit to ensure that these are of the correct dimensions, and it is assumed that new.data$Z is a list in the same order as the Z used when fitting the original model via rpql.

fit

An object of class pqrl. At the least, fit should be a list containing the elements fit$family for the family, e.g. gaussian(), poisson(); fit$fixef for the estimated vector of fixed effects; and fit$ran.cov, a list of estimated random effects covariance matrices. If appropriate, fit may also contain the elements fit$phi for the estimated variance parameter in normal, lognormal, and negative binomial GLMMs, fit$shape for the estimated shape parameter used in Gamma GLMMs, fit$trial.size for the trial size(s) for binomial GLMMs, and fit$zeroprob for the estimated probability of a structural zero in ZIP GLMMs.

B

A positive integer for the number of samples of the random effects to generate when performing Monte-Carlo integration. Defaults to 1000.

Details

Regularized PQL performs penalized joint (fixed and random effects) selection for GLMMs, where the penalized quasi-likelihood (PQL; Breslow and Clayton, 1993) is used as the loss function. After fitting, one may then wish to calculate the marginal log-likelihood for the (sub)model, defined as

\ell = \log\left(\int f(\bm{y}; \bm{\beta}, \bm{b}, \phi) f(\bm{b}; \bm{\Sigma}) d\bm{b}\right),

where f(\bm{y}; \bm{\beta}, \bm{b}, \phi) denotes the conditional likelihood of the responses \bm{y} given the fixed effects \bm{\beta}, the random effects \bm{b}, and nuisance parameters \phi if appropriate, and f(\bm{b}; \bm{\Sigma}) is the multivariate normal distribution for the random effects, with covariance matrix \bm{\Sigma}. calc.marglogL calculates the above marginal log-likelihood using Monte-Carlo integration.
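
As a rough illustration of this Monte-Carlo approach, the sketch below estimates the marginal log-likelihood of a Poisson GLMM with a single grouping factor, exploiting the fact that with independent clusters the marginal log-likelihood is a sum of per-cluster integrals. This is not the package's implementation; the function mc_marglogL and all of its arguments are illustrative names only.

## Minimal sketch (not the package implementation): Monte-Carlo estimate of the
## marginal log-likelihood for a Poisson GLMM with one grouping factor.
## The name mc_marglogL and its arguments are illustrative only.
## Requires the MASS package for mvrnorm.
mc_marglogL <- function(y, X, Z, id, beta, Sigma, B = 1000) {
    ids <- unique(id)
    cluster_logliks <- numeric(length(ids))
    for(i in seq_along(ids)) {
        sel <- which(id == ids[i])
        ## Draw B samples of the cluster-specific random effects b_i ~ N(0, Sigma)
        b_samples <- MASS::mvrnorm(B, mu = rep(0, ncol(Sigma)), Sigma = Sigma)
        ## Conditional log-likelihood of y_i given each sampled b_i
        condll <- apply(b_samples, 1, function(b) {
            eta <- X[sel, , drop = FALSE] %*% beta + Z[sel, , drop = FALSE] %*% b
            sum(dpois(y[sel], lambda = exp(eta), log = TRUE))
            })
        ## log of the Monte-Carlo average, computed stably on the log scale
        cluster_logliks[i] <- max(condll) + log(mean(exp(condll - max(condll))))
    }
    sum(cluster_logliks)
}

Larger values of B reduce the Monte-Carlo error of the estimate, at the expense of computation time.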

Admittedly, this function is not particularly useful for fitting the GLMM per se: it is never called by the main function rpql, and the marginal log-likelihood is (approximately) calculated anyway if hybrid.est = TRUE and the final submodel is refitted using lme4. Where the function comes in handy is when you have a validation or test dataset and you want to calculate the predicted (log-)likelihood of the test data given the regularized PQL fit.

Value

The marginal log-likelihood of new.data given the GLMM in fit.

Warnings

  • No check is made to see whether the dimensions of the elements of new.data and fit match, e.g. whether the number of columns in new.data$X equals the number of elements in fit$fixef. Please ensure they do!

  • Monte-Carlo integration is computationally intensive, especially if \bm{y} is long!

Author(s)

Francis K.C. Hui <francis.hui@gmail.com>, with contributions from Samuel Mueller <samuel.mueller@sydney.edu.au> and A.H. Welsh <Alan.Welsh@anu.edu.au>

Maintainer: Francis Hui <fhui28@gmail.com>

References

  • Breslow, N. E., & Clayton, D. G. (1993). Approximate inference in generalized linear mixed models. Journal of the American Statistical Association, 88, 9-25.

See Also

rpql for fitting and performing model selection in GLMMs using regularized PQL. lme4 also approximately calculates the marginal log-likelihood when fitting a GLMM.

Examples

## Not given
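
As a minimal sketch of the intended workflow (not supplied by the package authors): the code below simulates a small Poisson GLMM dataset with a single random intercept, builds a fit-style list by hand using the elements described under the fit argument (that such a hand-built list is accepted in place of an object returned by rpql is an assumption based on that description), and evaluates the marginal log-likelihood of the simulated data. In practice, fit would typically be the object returned by rpql.

library(rpql)

set.seed(1)
n <- 30; m <- 5                            ## 30 clusters of size 5
id <- rep(1:n, each = m)
X <- cbind(1, rnorm(n * m))                ## fixed effects: intercept + one covariate
Z <- matrix(1, nrow = n * m, ncol = 1)     ## random intercept design matrix
true_beta <- c(-0.5, 1)
true_D <- matrix(0.5, 1, 1)                ## random intercept variance
b <- rnorm(n, sd = sqrt(true_D[1, 1]))
y <- rpois(n * m, lambda = exp(X %*% true_beta + b[id]))

## A fit-style list with the minimal elements described under `fit`; using a
## hand-built list (and the name "cluster") is an illustrative assumption.
toy_fit <- list(family = poisson(), fixef = true_beta,
    ran.cov = list(cluster = true_D))

## Test data in the structure described under `new.data`; new.data$Z must be
## a list in the same order as the Z used in the original fit.
test_dat <- list(y = y, X = X, Z = list(cluster = Z))

calc.marglogL(new.data = test_dat, fit = toy_fit, B = 1000)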
