DHARMa: Residual Diagnostics for Hierarchical (Multi-Level / Mixed) Regression Models

The 'DHARMa' package uses a simulation-based approach to create readily interpretable scaled (quantile) residuals for fitted generalized linear mixed models. Currently supported are generalized linear mixed models from 'lme4' (classes 'lmerMod', 'glmerMod'), generalized additive models ('gam' from 'mgcv'), 'glm' (including 'negbin' from 'MASS', but excluding quasi-distributions) and 'lm' model classes. Alternatively, externally created simulations, e.g. posterior predictive simulations from Bayesian software such as 'JAGS', 'STAN', or 'BUGS' can be processed as well. The resulting residuals are standardized to values between 0 and 1 and can be interpreted as intuitively as residuals from a linear regression. The package also provides a number of plot and test functions for typical model misspecification problems, such as over/underdispersion, zero-inflation, and residual spatial and temporal autocorrelation.
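The standard workflow described above can be sketched as follows. This is a minimal, hedged example using the simulated example data that `createData` generates and a Poisson GLMM fitted with 'lme4'; column names (`observedResponse`, `Environment1`, `group`) follow `createData`'s output, and `n = 250` is an arbitrary choice of simulation count:

```r
library(DHARMa)
library(lme4)

# Simulated example data shipped with DHARMa (Poisson response by default)
testData <- createData(sampleSize = 200, family = poisson())

# Fit a Poisson GLMM with a random intercept per group
fittedModel <- glmer(observedResponse ~ Environment1 + (1 | group),
                     family = poisson(), data = testData)

# Simulate 250 datasets from the fitted model and compute scaled residuals
simulationOutput <- simulateResiduals(fittedModel = fittedModel, n = 250)

# QQ plot of the scaled residuals and residuals vs. predicted values
plot(simulationOutput)

# Formal tests for common misspecification problems
testUniformity(simulationOutput)
testZeroInflation(simulationOutput)
```

If the model is correctly specified, the scaled residuals should be uniformly distributed on [0, 1], which is what `testUniformity` checks via a Kolmogorov-Smirnov test.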

Install the latest version of this package by entering the following in R:

install.packages("DHARMa")
Author: Florian Hartig [aut, cre]
Date of publication: 2017-03-11 00:03:57
Maintainer: Florian Hartig <florian.hartig@biologie.uni-regensburg.de>
License: GPL (>= 3)

benchmarkOverdispersion Man page
benchmarkP Man page
benchmarkUniformity Man page
createData Man page
createDHARMa Man page
DHARMa Man page
DHARMa-package Man page
fitted.gam Man page
plotConventionalResiduals Man page
plot.DHARMa Man page
plotResiduals Man page
plotSimulatedResiduals Man page
print.DHARMa Man page
simulateResiduals Man page
testOverdispersion Man page
testOverdispersionParametric Man page
testSimulatedResiduals Man page
testSpatialAutocorrelation Man page
testTemporalAutocorrelation Man page
testUniformity Man page
testZeroInflation Man page
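For externally created simulations, such as posterior predictive draws from Bayesian software, `createDHARMa` (listed above) wraps them into a DHARMa object. A hedged sketch, where `posteriorSims` is a placeholder matrix with one row per observation and one column per posterior predictive simulation, and `observed` is the placeholder vector of observed responses:

```r
library(DHARMa)

# Wrap external simulations into a DHARMa object; the median of the
# simulations is used here as a stand-in for the fitted predicted response
sims <- createDHARMa(simulatedResponse = posteriorSims,
                     observedResponse = observed,
                     fittedPredictedResponse = apply(posteriorSims, 1, median),
                     integerResponse = TRUE)

# The resulting object supports the same plots and tests as simulateResiduals
plot(sims)
testUniformity(sims)
```

Set `integerResponse = TRUE` for count-like responses so that the quantile residuals are randomized appropriately.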

Questions? Problems? Suggestions? Email ian@mutexlabs.com.

Please suggest features or report bugs with the GitHub issue tracker.

All documentation is copyright its authors; we did not write any of it.